2026-03-31T20:18:42.203 INFO:root:teuthology version: 1.2.4.dev37+ga59626679
2026-03-31T20:18:42.209 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-31T20:18:42.227 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4344
branch: tentacle
description: rados/singleton-bluestore/{all/cephtool mon_election/connectivity msgr-failures/none msgr/async-v2only objectstore/bluestore/{alloc$/{avl} base mem$/{normal-2} onode-segment$/{1M} write$/{v2/{compr$/{yes$/{lz4}} v2}}} rados supported-random-distro$/{ubuntu_latest}}
email: null
first_in_suite: false
flavor: default
job_id: '4344'
ktype: distro
last_in_suite: false
machine_type: vps
name: kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps
no_nested_subset: false
openstack:
- volumes:
    count: 3
    size: 10
os_type: ubuntu
os_version: '22.04'
overrides:
  admin_socket:
    branch: tentacle
  ansible.cephlab:
    branch: main
    repo: https://github.com/kshtsk/ceph-cm-ansible.git
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      logical_volumes:
        lv_1:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_2:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_3:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
        lv_4:
          scratch_dev: true
          size: 25%VG
          vg: vg_nvme
      timezone: UTC
      volume_groups:
        vg_nvme:
          pvs: /dev/vdb,/dev/vdc,/dev/vdd,/dev/vde
  ceph:
    conf:
      global:
        mon election default strategy: 3
        ms bind msgr1: false
        ms bind msgr2: true
        ms type: async
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
        mon scrub interval: 300
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore allocator: avl
        bluestore block size: 96636764160
        bluestore compression algorithm: lz4
        bluestore compression mode: aggressive
        bluestore fsck on mount: true
        bluestore onode segment size: 1024K
        bluestore write v2: true
        bluestore zero block detection: true
        debug bluefs: 20
        debug bluestore: 20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd debug verify cached snaps: true
        osd debug verify missing on start: true
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd mclock override recovery settings: true
        osd mclock profile: high_recovery_ops
        osd mclock skip benchmark: true
        osd objectstore: bluestore
        osd op queue: debug_random
        osd op queue cut off: debug_random
        osd_mclock_skip_benchmark: true
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: 5bb3278730741031382ca9c3dc9d221a942e06a2
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  cephadm:
    cephadm_binary_url: https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm
  install:
    ceph:
      flavor: default
      sha1: 5bb3278730741031382ca9c3dc9d221a942e06a2
    extra_system_packages:
      deb:
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-jmespath
      - python3-xmltodict
      - s3cmd
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-tentacle
    sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mon.b
  - mon.c
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 6407
sha1: 5bb3278730741031382ca9c3dc9d221a942e06a2
sleep_before_teardown: 0
subset: 1/100000
suite: rados
suite_branch: tt-tentacle
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 0392f78529848ec72469e8e431875cb98d3a5fb4
targets:
  vm03.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJoxb4u06Bk6dqJOB3WKNas+bv6SaRmj8p0RqqE/tvYpF5y2y7o+v+bN/kaQLQPm2kj/uk0WUkoXH7Dw2J1ZUqM=
tasks:
- install: null
- ceph:
    log-ignorelist:
    - but it is still running
    - had wrong client addr
    - had wrong cluster addr
    - must scrub before tier agent can activate
    - failsafe engaged, dropping updates
    - failsafe disengaged, no longer dropping updates
    - overall HEALTH_
    - \(OSDMAP_FLAGS\)
    - \(OSD_
    - \(PG_
    - \(SMALLER_PG_NUM\)
    - \(SMALLER_PGP_NUM\)
    - \(CACHE_POOL_NO_HIT_SET\)
    - \(CACHE_POOL_NEAR_FULL\)
    - \(FS_WITH_FAILED_MDS\)
    - \(FS_DEGRADED\)
    - \(POOL_BACKFILLFULL\)
    - \(POOL_FULL\)
    - \(SMALLER_PGP_NUM\)
    - \(POOL_NEARFULL\)
    - \(POOL_APP_NOT_ENABLED\)
    - \(AUTH_BAD_CAPS\)
    - \(FS_INLINE_DATA_DEPRECATED\)
    - \(MON_DOWN\)
    - \(SLOW_OPS\)
    - slow request
- workunit:
    clients:
      all:
      - cephtool
      - mon/pool_ops.sh
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: uv2
teuthology_repo: https://github.com/kshtsk/teuthology
teuthology_sha1: a59626679648f962bca99d20d35578f2998c8f37
timestamp: 2026-03-31_11:18:10
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.282426
2026-03-31T20:18:42.227 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa; will attempt to use it
2026-03-31T20:18:42.228 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa/tasks
2026-03-31T20:18:42.228 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-31T20:18:42.228 INFO:teuthology.task.internal:Checking packages...
2026-03-31T20:18:42.228 INFO:teuthology.task.internal:Checking packages for os_type 'ubuntu', flavor 'default' and ceph hash '5bb3278730741031382ca9c3dc9d221a942e06a2'
2026-03-31T20:18:42.228 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-31T20:18:42.228 INFO:teuthology.packaging:ref: None
2026-03-31T20:18:42.228 INFO:teuthology.packaging:tag: None
2026-03-31T20:18:42.228 INFO:teuthology.packaging:branch: tentacle
2026-03-31T20:18:42.228 INFO:teuthology.packaging:sha1: 5bb3278730741031382ca9c3dc9d221a942e06a2
2026-03-31T20:18:42.228 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=tentacle
2026-03-31T20:18:43.083 INFO:teuthology.task.internal:Found packages for ceph version 20.2.0-714-g147f7c6a-1jammy
2026-03-31T20:18:43.083 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-31T20:18:43.084 INFO:teuthology.task.internal:no buildpackages task found
2026-03-31T20:18:43.084 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-31T20:18:43.084 INFO:teuthology.task.internal:Saving configuration
2026-03-31T20:18:43.090 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-31T20:18:43.091 INFO:teuthology.task.internal.check_lock:Checking locks...
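The `internal.check_packages` task above verifies that binaries for this branch exist before any node is touched, by searching Shaman for ready repos. A minimal sketch of that query, using the exact parameters from the logged URL; the response-field layout is an assumption based on typical Shaman search results, not shown in this log:

```python
# Sketch of the Shaman build search performed by internal.check_packages.
# Query parameters are taken verbatim from the logged URL; treating the
# JSON response as a list of build records is an assumption.
import requests

def find_ready_builds(ref="tentacle", distro="ubuntu/22.04/x86_64",
                      flavor="default", project="ceph"):
    resp = requests.get(
        "https://shaman.ceph.com/api/search",
        params={"status": "ready", "project": project,
                "flavor": flavor, "distros": distro, "ref": ref},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # assumed: one record per ready repo/build

builds = find_ready_builds()
print(f"{len(builds)} ready build(s) for ref 'tentacle'")
```

If the search comes back empty, there is nothing to install and the job can fail fast, which is exactly why this check runs before locking and provisioning work begins.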
2026-03-31T20:18:43.097 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm03.local', 'description': '/archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4344', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'ubuntu', 'os_version': '22.04', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-31 20:17:58.383497', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:03', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJoxb4u06Bk6dqJOB3WKNas+bv6SaRmj8p0RqqE/tvYpF5y2y7o+v+bN/kaQLQPm2kj/uk0WUkoXH7Dw2J1ZUqM='}
2026-03-31T20:18:43.098 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-31T20:18:43.098 INFO:teuthology.task.internal:roles: ubuntu@vm03.local - ['mon.a', 'mon.b', 'mon.c', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-31T20:18:43.098 INFO:teuthology.run_tasks:Running task console_log...
2026-03-31T20:18:43.105 DEBUG:teuthology.task.console_log:vm03 does not support IPMI; excluding
2026-03-31T20:18:43.105 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f44a20ae290>, signals=[15])
2026-03-31T20:18:43.105 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-31T20:18:43.105 INFO:teuthology.task.internal:Opening connections...
2026-03-31T20:18:43.106 DEBUG:teuthology.task.internal:connecting to ubuntu@vm03.local
2026-03-31T20:18:43.106 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-31T20:18:43.183 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-31T20:18:43.184 DEBUG:teuthology.orchestra.run.vm03:> uname -m
2026-03-31T20:18:43.321 INFO:teuthology.orchestra.run.vm03.stdout:x86_64
2026-03-31T20:18:43.321 DEBUG:teuthology.orchestra.run.vm03:> cat /etc/os-release
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:PRETTY_NAME="Ubuntu 22.04.5 LTS"
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:NAME="Ubuntu"
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:VERSION_ID="22.04"
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:VERSION="22.04.5 LTS (Jammy Jellyfish)"
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:VERSION_CODENAME=jammy
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:ID=ubuntu
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:ID_LIKE=debian
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:HOME_URL="https://www.ubuntu.com/"
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:SUPPORT_URL="https://help.ubuntu.com/"
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
2026-03-31T20:18:43.368 INFO:teuthology.orchestra.run.vm03.stdout:UBUNTU_CODENAME=jammy
2026-03-31T20:18:43.368 INFO:teuthology.lock.ops:Updating vm03.local on lock server
2026-03-31T20:18:43.388 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
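The status dict printed by `internal.check_lock` above carries everything the job needs to confirm it really owns the node. A sketch of the sanity check this implies; the dict literal mirrors the logged status (trimmed), and the helper name is hypothetical:

```python
# Sketch of the ownership check implied by internal.check_lock: the job
# may proceed only if the target is up, locked, and locked by the job's
# owner. Values below are copied from the status printed in the log.
status = {
    "name": "vm03.local", "up": True, "is_vm": True,
    "locked": True, "locked_by": "kyr",
    "os_type": "ubuntu", "os_version": "22.04", "arch": "x86_64",
}

def assert_owned(status, owner="kyr"):
    assert status["up"], f"{status['name']} is down"
    assert status["locked"], f"{status['name']} is not locked"
    assert status["locked_by"] == owner, (
        f"{status['name']} locked by {status['locked_by']}, not {owner}")

assert_owned(status)
```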
2026-03-31T20:18:43.400 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-31T20:18:43.401 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-31T20:18:43.401 DEBUG:teuthology.orchestra.run.vm03:> test '!' -e /home/ubuntu/cephtest
2026-03-31T20:18:43.411 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-31T20:18:43.412 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-31T20:18:43.412 DEBUG:teuthology.orchestra.run.vm03:> test -z $(ls -A /var/lib/ceph)
2026-03-31T20:18:43.456 INFO:teuthology.orchestra.run.vm03.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-31T20:18:43.456 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-31T20:18:43.462 DEBUG:teuthology.orchestra.run.vm03:> test -e /ceph-qa-ready
2026-03-31T20:18:43.499 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-31T20:18:43.726 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-31T20:18:43.728 INFO:teuthology.task.internal:Creating test directory...
2026-03-31T20:18:43.728 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-31T20:18:43.730 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-31T20:18:43.731 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-31T20:18:43.732 INFO:teuthology.task.internal:Creating archive directory...
2026-03-31T20:18:43.732 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-31T20:18:43.777 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-31T20:18:43.778 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-31T20:18:43.778 DEBUG:teuthology.orchestra.run.vm03:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-31T20:18:43.819 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-31T20:18:43.820 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-31T20:18:43.868 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T20:18:43.872 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-31T20:18:43.873 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-31T20:18:43.874 INFO:teuthology.task.internal:Configuring sudo...
2026-03-31T20:18:43.874 DEBUG:teuthology.orchestra.run.vm03:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-31T20:18:43.920 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-31T20:18:43.922 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
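The `internal.coredump` command above is a three-step pipeline: create the per-job coredump directory, point `kernel.core_pattern` at it, and append the setting to `/etc/sysctl.conf` so it survives a reboot. A sketch of how that command string is assembled; the command text is copied from the log, while the helper itself is hypothetical (teuthology runs it through its own remote runner):

```python
# Sketch: compose the internal.coredump shell pipeline seen in the log.
# %t.%p.core names each core file by epoch timestamp and PID.
def coredump_cmd(testdir="/home/ubuntu/cephtest"):
    pattern = f"{testdir}/archive/coredump/%t.%p.core"
    return (
        f"install -d -m0755 -- {testdir}/archive/coredump && "
        f"sudo sysctl -w kernel.core_pattern={pattern} && "
        f"echo kernel.core_pattern={pattern} | sudo tee -a /etc/sysctl.conf"
    )

print(coredump_cmd())
```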
2026-03-31T20:18:43.923 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-31T20:18:43.964 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-31T20:18:44.008 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-31T20:18:44.052 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:18:44.052 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-31T20:18:44.101 DEBUG:teuthology.orchestra.run.vm03:> sudo service rsyslog restart
2026-03-31T20:18:44.157 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-31T20:18:44.188 INFO:teuthology.task.internal:Starting timer...
2026-03-31T20:18:44.188 INFO:teuthology.run_tasks:Running task pcp...
2026-03-31T20:18:44.195 INFO:teuthology.run_tasks:Running task selinux...
2026-03-31T20:18:44.203 INFO:teuthology.task.selinux:Excluding vm03: VMs are not yet supported
2026-03-31T20:18:44.203 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-31T20:18:44.203 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-31T20:18:44.203 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-31T20:18:44.203 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-31T20:18:44.208 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}
2026-03-31T20:18:44.209 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main to origin/main
2026-03-31T20:18:44.215 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-31T20:18:44.215 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "logical_volumes": {"lv_1": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_2": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_3": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}, "lv_4": {"scratch_dev": true, "size": "25%VG", "vg": "vg_nvme"}}, "timezone": "UTC", "volume_groups": {"vg_nvme": {"pvs": "/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde"}}}' -i /tmp/teuth_ansible_inventoryc5fq3k79 --limit vm03.local /home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
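The logged `ansible-playbook` invocation above is assembled from the `ansible.cephlab` overrides: the task `vars` become one `--extra-vars` JSON blob, `skip_tags` passes through, and the run is limited to the job's target. A sketch of that composition under those assumptions (paths and values come from the log; the quoting via `shlex` is my choice, not necessarily teuthology's):

```python
# Sketch: build the ansible-playbook command line from task overrides,
# as in the DEBUG line above. extra_vars is trimmed for brevity.
import json
import shlex

def ansible_cmd(extra_vars, inventory, limit, playbook, skip_tags):
    return ["ansible-playbook", "-v",
            "--extra-vars", json.dumps(extra_vars),
            "-i", inventory, "--limit", limit, playbook,
            "--skip-tags", skip_tags]

cmd = ansible_cmd(
    {"ansible_ssh_user": "ubuntu", "timezone": "UTC"},  # trimmed vars
    "/tmp/teuth_ansible_inventoryc5fq3k79", "vm03.local",
    "/home/teuthos/src/github.com_kshtsk_ceph-cm-ansible_main/cephlab.yml",
    "nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,"
    "ntp-client,resolvconf,cpan,nfs")
print(shlex.join(cmd))
```

Note the playbook's final play only touches `/ceph-qa-ready` when `ran_from_cephlab_playbook` is set, which is the marker `internal.vm_setup` tested for (and did not find) earlier.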
2026-03-31T20:20:24.563 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm03.local')]
2026-03-31T20:20:24.563 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm03.local'
2026-03-31T20:20:24.564 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-31T20:20:24.622 DEBUG:teuthology.orchestra.run.vm03:> true
2026-03-31T20:20:24.824 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm03.local'
2026-03-31T20:20:24.824 INFO:teuthology.run_tasks:Running task clock...
2026-03-31T20:20:24.827 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-31T20:20:24.827 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-31T20:20:24.827 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: ntpd 4.2.8p15@1.3728-o Wed Feb 16 17:13:02 UTC 2022 (1): Starting
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: Command line: ntpd -gq
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: ----------------------------------------------------
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: ntp-4 is maintained by Network Time Foundation,
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: corporation. Support and training for ntp-4 are
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: available at https://www.nwtime.org/support
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: ----------------------------------------------------
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: proto: precision = 0.029 usec (-25)
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: basedate set to 2022-02-04
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: gps base set to 2022-02-06 (week 2196)
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): good hash signature
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): loaded, expire=2025-12-28T00:00:00Z last=2017-01-01T00:00:00Z ofs=37
2026-03-31T20:20:24.879 INFO:teuthology.orchestra.run.vm03.stderr:31 Mar 20:20:24 ntpd[16222]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): expired 94 days ago
2026-03-31T20:20:24.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: Listen and drop on 0 v6wildcard [::]:123
2026-03-31T20:20:24.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2026-03-31T20:20:24.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: Listen normally on 2 lo 127.0.0.1:123
2026-03-31T20:20:24.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: Listen normally on 3 ens3 192.168.123.103:123
2026-03-31T20:20:24.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: Listen normally on 4 lo [::1]:123
2026-03-31T20:20:24.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: Listen normally on 5 ens3 [fe80::5055:ff:fe00:3%2]:123
2026-03-31T20:20:24.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:24 ntpd[16222]: Listening on routing socket on fd #22 for interface updates
2026-03-31T20:20:25.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:25 ntpd[16222]: Soliciting pool server 139.162.152.20
2026-03-31T20:20:26.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:26 ntpd[16222]: Soliciting pool server 139.144.71.56
2026-03-31T20:20:26.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:26 ntpd[16222]: Soliciting pool server 78.46.87.46
2026-03-31T20:20:27.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:27 ntpd[16222]: Soliciting pool server 141.84.43.73
2026-03-31T20:20:27.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:27 ntpd[16222]: Soliciting pool server 46.21.2.169
2026-03-31T20:20:27.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:27 ntpd[16222]: Soliciting pool server 212.132.97.26
2026-03-31T20:20:28.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:28 ntpd[16222]: Soliciting pool server 144.91.126.59
2026-03-31T20:20:28.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:28 ntpd[16222]: Soliciting pool server 81.3.27.46
2026-03-31T20:20:28.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:28 ntpd[16222]: Soliciting pool server 172.236.195.26
2026-03-31T20:20:28.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:28 ntpd[16222]: Soliciting pool server 195.201.63.235
2026-03-31T20:20:29.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:29 ntpd[16222]: Soliciting pool server 172.104.134.72
2026-03-31T20:20:29.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:29 ntpd[16222]: Soliciting pool server 79.133.44.136
2026-03-31T20:20:29.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:29 ntpd[16222]: Soliciting pool server 5.75.181.179
2026-03-31T20:20:29.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:29 ntpd[16222]: Soliciting pool server 185.125.190.56
2026-03-31T20:20:30.879 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:30 ntpd[16222]: Soliciting pool server 91.189.91.157
2026-03-31T20:20:30.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:30 ntpd[16222]: Soliciting pool server 46.224.156.215
2026-03-31T20:20:30.880 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:30 ntpd[16222]: Soliciting pool server 85.215.189.120
2026-03-31T20:20:34.902 INFO:teuthology.orchestra.run.vm03.stdout:31 Mar 20:20:34 ntpd[16222]: ntpd: time slew -0.004147 s
2026-03-31T20:20:34.902 INFO:teuthology.orchestra.run.vm03.stdout:ntpd: time slew -0.004147s
2026-03-31T20:20:34.921 INFO:teuthology.orchestra.run.vm03.stdout:     remote           refid      st t when poll reach   delay   offset  jitter
2026-03-31T20:20:34.921 INFO:teuthology.orchestra.run.vm03.stdout:==============================================================================
2026-03-31T20:20:34.921 INFO:teuthology.orchestra.run.vm03.stdout: 0.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-31T20:20:34.921 INFO:teuthology.orchestra.run.vm03.stdout: 1.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-31T20:20:34.921 INFO:teuthology.orchestra.run.vm03.stdout: 2.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-31T20:20:34.921 INFO:teuthology.orchestra.run.vm03.stdout: 3.ubuntu.pool.n .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-31T20:20:34.921 INFO:teuthology.orchestra.run.vm03.stdout: ntp.ubuntu.com  .POOL.          16 p    -   64    0    0.000   +0.000   0.000
2026-03-31T20:20:34.922 INFO:teuthology.run_tasks:Running task install...
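The clock task above stops whichever time daemon is present, forces a one-shot step (`ntpd -gq`, here slewing by about 4 ms), restarts the daemon, and dumps `ntpq -p`. A sketch of reading that peer table; the parsing scheme is mine, not teuthology's. In this run every row is still a `.POOL.` placeholder with `reach` 0, so no real per-peer offset is available yet:

```python
# Sketch: parse the `ntpq -p` table printed above. Columns, in order:
# remote, refid, st, t, when, poll, reach, delay, offset, jitter.
def parse_ntpq(output):
    peers = []
    for line in output.splitlines()[2:]:     # skip header and ruler
        cols = line.split()
        if len(cols) == 10:
            peers.append({"remote": cols[0],
                          "refid": cols[1],
                          "reach": int(cols[6], 8),   # reach is octal
                          "offset_ms": float(cols[8])})
    return peers
```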
2026-03-31T20:20:34.924 DEBUG:teuthology.task.install:project ceph
2026-03-31T20:20:34.924 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': '5bb3278730741031382ca9c3dc9d221a942e06a2'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-31T20:20:34.924 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': '5bb3278730741031382ca9c3dc9d221a942e06a2', 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}
2026-03-31T20:20:34.924 INFO:teuthology.task.install:Using flavor: default
2026-03-31T20:20:34.926 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-31T20:20:34.926 INFO:teuthology.task.install:extra packages: []
2026-03-31T20:20:34.926 DEBUG:teuthology.orchestra.run.vm03:> sudo apt-key list | grep Ceph
2026-03-31T20:20:35.002 INFO:teuthology.orchestra.run.vm03.stderr:Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
2026-03-31T20:20:35.020 INFO:teuthology.orchestra.run.vm03.stdout:uid [ unknown] Ceph automated package build (Ceph automated package build)
2026-03-31T20:20:35.020 INFO:teuthology.orchestra.run.vm03.stdout:uid [ unknown] Ceph.com (release key)
2026-03-31T20:20:35.020 INFO:teuthology.task.install.deb:Installing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on remote deb x86_64
2026-03-31T20:20:35.020 INFO:teuthology.task.install.deb:Installing system (non-project) packages: python3-jmespath, python3-xmltodict, s3cmd on remote deb x86_64
2026-03-31T20:20:35.020 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=5bb3278730741031382ca9c3dc9d221a942e06a2
2026-03-31T20:20:35.599 INFO:teuthology.task.install.deb:Pulling from https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default/
2026-03-31T20:20:35.599 INFO:teuthology.task.install.deb:Package version is 20.2.0-721-g5bb32787-1jammy
2026-03-31T20:20:36.099 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:20:36.099 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/apt/sources.list.d/ceph.list
2026-03-31T20:20:36.107 DEBUG:teuthology.orchestra.run.vm03:> sudo apt-get update
2026-03-31T20:20:36.273 INFO:teuthology.orchestra.run.vm03.stdout:Hit:1 http://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-31T20:20:36.415 INFO:teuthology.orchestra.run.vm03.stdout:Hit:2 http://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-31T20:20:36.525 INFO:teuthology.orchestra.run.vm03.stdout:Hit:3 http://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-31T20:20:36.634 INFO:teuthology.orchestra.run.vm03.stdout:Hit:4 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-31T20:20:36.645 INFO:teuthology.orchestra.run.vm03.stdout:Ign:5 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy InRelease
2026-03-31T20:20:36.757 INFO:teuthology.orchestra.run.vm03.stdout:Get:6 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy Release [7689 B]
2026-03-31T20:20:36.868 INFO:teuthology.orchestra.run.vm03.stdout:Ign:7 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy Release.gpg
2026-03-31T20:20:36.979 INFO:teuthology.orchestra.run.vm03.stdout:Get:8 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 Packages [18.8 kB]
2026-03-31T20:20:37.052 INFO:teuthology.orchestra.run.vm03.stdout:Fetched 26.5 kB in 1s (33.2 kB/s)
2026-03-31T20:20:37.634 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
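Here the installer resolves the exact sha1 through Shaman, derives the chacra repo URL it then writes into `/etc/apt/sources.list.d/ceph.list`, and refreshes apt. A sketch of the URL pattern visible in the log; the host (`3.chacra.ceph.com`) came from the Shaman lookup, and since the file's actual contents are not shown in this excerpt, the `deb` line below is an assumption:

```python
# Sketch: compose the chacra repo URL seen in the "Pulling from" line:
# r/<project>/<ref>/<sha1>/<distro>/<codename>/flavors/<flavor>/.
def chacra_repo(sha1, project="ceph", ref="tentacle-release",
                distro="ubuntu", codename="jammy", flavor="default",
                host="3.chacra.ceph.com"):
    url = (f"https://{host}/r/{project}/{ref}/{sha1}/"
           f"{distro}/{codename}/flavors/{flavor}/")
    # Assumed sources.list entry; the real file contents are not logged.
    return url, f"deb [trusted=yes] {url} {codename} main"

url, deb_line = chacra_repo("5bb3278730741031382ca9c3dc9d221a942e06a2")
print(deb_line)
```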
2026-03-31T20:20:37.646 DEBUG:teuthology.orchestra.run.vm03:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=20.2.0-721-g5bb32787-1jammy cephadm=20.2.0-721-g5bb32787-1jammy ceph-mds=20.2.0-721-g5bb32787-1jammy ceph-mgr=20.2.0-721-g5bb32787-1jammy ceph-common=20.2.0-721-g5bb32787-1jammy ceph-fuse=20.2.0-721-g5bb32787-1jammy ceph-test=20.2.0-721-g5bb32787-1jammy ceph-volume=20.2.0-721-g5bb32787-1jammy radosgw=20.2.0-721-g5bb32787-1jammy python3-rados=20.2.0-721-g5bb32787-1jammy python3-rgw=20.2.0-721-g5bb32787-1jammy python3-cephfs=20.2.0-721-g5bb32787-1jammy python3-rbd=20.2.0-721-g5bb32787-1jammy libcephfs2=20.2.0-721-g5bb32787-1jammy libcephfs-dev=20.2.0-721-g5bb32787-1jammy librados2=20.2.0-721-g5bb32787-1jammy librbd1=20.2.0-721-g5bb32787-1jammy rbd-fuse=20.2.0-721-g5bb32787-1jammy
2026-03-31T20:20:37.677 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:20:37.817 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:20:37.817 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:  kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:  libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:The following additional packages will be installed:
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:  ceph-base ceph-mgr-cephadm ceph-mgr-dashboard ceph-mgr-diskprediction-local
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:  ceph-mgr-k8sevents ceph-mgr-modules-core ceph-mon ceph-osd jq
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:  libcephfs-daemon libcephfs-proxy2 libdouble-conversion3 libfuse2 libjq1
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:  liblttng-ust1 libnbd0 liboath0 libonig5 libpcre2-16-0 libqt5core5a
2026-03-31T20:20:37.921 INFO:teuthology.orchestra.run.vm03.stdout:  libqt5dbus5 libqt5network5 libradosstriper1 librdkafka1 librgw2
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  libsqlite3-mod-ceph libthrift-0.16.0 nvme-cli python-asyncssh-doc
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-asyncssh python3-cachetools python3-ceph-argparse
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-iniconfig python3-jaraco.classes python3-jaraco.collections
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-kubernetes python3-natsort python3-pluggy python3-portend
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-prettytable python3-psutil python3-py python3-pygments
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-pytest python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-tempora python3-threadpoolctl python3-toml python3-wcwidth
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-webob python3-websocket python3-zc.lockfile qttranslations5-l10n
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  smartmontools socat xmlstarlet
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:Suggested packages:
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python3-influxdb liblua5.3-dev luarocks python-natsort-doc python-psutil-doc
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  subversion python-pygments-doc ttf-bitstream-vera python3-paste python3-dap
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  python-sklearn-doc ipython3 python-webob-doc gsmartcontrol smart-notifier
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  mailx | mailutils
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:Recommended packages:
2026-03-31T20:20:37.922 INFO:teuthology.orchestra.run.vm03.stdout:  btrfs-tools
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:The following NEW packages will be installed:
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  ceph ceph-base ceph-common ceph-fuse ceph-mds ceph-mgr ceph-mgr-cephadm
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-k8sevents
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  ceph-mgr-modules-core ceph-mon ceph-osd ceph-test ceph-volume cephadm jq
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  libcephfs-daemon libcephfs-dev libcephfs-proxy2 libcephfs2
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  libdouble-conversion3 libfuse2 libjq1 liblttng-ust1 libnbd0 liboath0
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  libonig5 libpcre2-16-0 libqt5core5a libqt5dbus5 libqt5network5
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  libradosstriper1 librdkafka1 librgw2 libsqlite3-mod-ceph libthrift-0.16.0
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  python3-ceph-argparse python3-ceph-common python3-cephfs python3-cheroot
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  python3-jaraco.text python3-joblib python3-kubernetes python3-natsort
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  python3-pluggy python3-portend python3-prettytable python3-psutil python3-py
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  python3-pygments python3-pytest python3-rados python3-rbd python3-repoze.lru
2026-03-31T20:20:37.959 INFO:teuthology.orchestra.run.vm03.stdout:  python3-requests-oauthlib python3-rgw python3-routes python3-rsa
2026-03-31T20:20:37.960 INFO:teuthology.orchestra.run.vm03.stdout:  python3-simplejson python3-sklearn python3-sklearn-lib python3-tempora
2026-03-31T20:20:37.960 INFO:teuthology.orchestra.run.vm03.stdout:  python3-threadpoolctl python3-toml python3-wcwidth python3-webob
2026-03-31T20:20:37.960 INFO:teuthology.orchestra.run.vm03.stdout:  python3-websocket python3-zc.lockfile qttranslations5-l10n radosgw rbd-fuse
2026-03-31T20:20:37.960 INFO:teuthology.orchestra.run.vm03.stdout:  smartmontools socat xmlstarlet
2026-03-31T20:20:37.960 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be upgraded:
2026-03-31T20:20:37.960 INFO:teuthology.orchestra.run.vm03.stdout:  librados2 librbd1
2026-03-31T20:20:37.987 INFO:teuthology.orchestra.run.vm03.stdout:2 upgraded, 85 newly installed, 0 to remove and 50 not upgraded.
2026-03-31T20:20:37.987 INFO:teuthology.orchestra.run.vm03.stdout:Need to get 281 MB of archives.
2026-03-31T20:20:37.987 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 1092 MB of additional disk space will be used.
2026-03-31T20:20:37.987 INFO:teuthology.orchestra.run.vm03.stdout:Get:1 http://archive.ubuntu.com/ubuntu jammy/main amd64 liblttng-ust1 amd64 2.13.1-1ubuntu1 [190 kB]
2026-03-31T20:20:38.025 INFO:teuthology.orchestra.run.vm03.stdout:Get:2 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libdouble-conversion3 amd64 3.1.7-4 [39.0 kB]
2026-03-31T20:20:38.026 INFO:teuthology.orchestra.run.vm03.stdout:Get:3 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpcre2-16-0 amd64 10.39-3ubuntu0.1 [203 kB]
2026-03-31T20:20:38.034 INFO:teuthology.orchestra.run.vm03.stdout:Get:4 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5core5a amd64 5.15.3+dfsg-2ubuntu0.2 [2006 kB]
2026-03-31T20:20:38.080 INFO:teuthology.orchestra.run.vm03.stdout:Get:5 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5dbus5 amd64 5.15.3+dfsg-2ubuntu0.2 [222 kB]
2026-03-31T20:20:38.081 INFO:teuthology.orchestra.run.vm03.stdout:Get:6 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5network5 amd64 5.15.3+dfsg-2ubuntu0.2 [731 kB]
2026-03-31T20:20:38.084 INFO:teuthology.orchestra.run.vm03.stdout:Get:7 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libthrift-0.16.0 amd64 0.16.0-2 [267 kB]
2026-03-31T20:20:38.085 INFO:teuthology.orchestra.run.vm03.stdout:Get:8 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libnbd0 amd64 1.10.5-1 [71.3 kB]
2026-03-31T20:20:38.086 INFO:teuthology.orchestra.run.vm03.stdout:Get:9 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-wcwidth all 0.2.5+dfsg1-1 [21.9 kB]
2026-03-31T20:20:38.086 INFO:teuthology.orchestra.run.vm03.stdout:Get:10 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-prettytable all 2.5.0-2 [31.3 kB]
2026-03-31T20:20:38.086 INFO:teuthology.orchestra.run.vm03.stdout:Get:11 http://archive.ubuntu.com/ubuntu jammy/universe amd64 librdkafka1 amd64 1.8.0-1build1 [633 kB]
2026-03-31T20:20:38.088 INFO:teuthology.orchestra.run.vm03.stdout:Get:12 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 liboath0 amd64 2.6.7-3ubuntu0.1 [41.3 kB]
2026-03-31T20:20:38.089 INFO:teuthology.orchestra.run.vm03.stdout:Get:13 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.functools all 3.4.0-2 [9030 B]
2026-03-31T20:20:38.089 INFO:teuthology.orchestra.run.vm03.stdout:Get:14 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-cheroot all 8.5.2+ds1-1ubuntu3.2 [72.1 kB]
2026-03-31T20:20:38.096 INFO:teuthology.orchestra.run.vm03.stdout:Get:15 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.classes all 3.2.1-3 [6452 B]
2026-03-31T20:20:38.096 INFO:teuthology.orchestra.run.vm03.stdout:Get:16 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.text all 3.6.0-2 [8716 B]
2026-03-31T20:20:38.096 INFO:teuthology.orchestra.run.vm03.stdout:Get:17 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.collections all 3.4.0-2 [11.4 kB]
2026-03-31T20:20:38.096 INFO:teuthology.orchestra.run.vm03.stdout:Get:18 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempora all 4.1.2-1 [14.8 kB]
2026-03-31T20:20:38.097 INFO:teuthology.orchestra.run.vm03.stdout:Get:19 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-portend all 3.0.0-1 [7240 B]
2026-03-31T20:20:38.097 INFO:teuthology.orchestra.run.vm03.stdout:Get:20 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-zc.lockfile all 2.0-1 [8980 B]
2026-03-31T20:20:38.104 INFO:teuthology.orchestra.run.vm03.stdout:Get:21 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cherrypy3 all 18.6.1-4 [208 kB]
2026-03-31T20:20:38.106 INFO:teuthology.orchestra.run.vm03.stdout:Get:22 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-natsort all 8.0.2-1 [35.3 kB]
2026-03-31T20:20:38.106 INFO:teuthology.orchestra.run.vm03.stdout:Get:23 http://archive.ubuntu.com/ubuntu jammy/universe amd64 libfuse2 amd64 2.9.9-5ubuntu3 [90.3 kB]
2026-03-31T20:20:38.108 INFO:teuthology.orchestra.run.vm03.stdout:Get:24 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-asyncssh all 2.5.0-1ubuntu0.1 [189 kB]
2026-03-31T20:20:38.112 INFO:teuthology.orchestra.run.vm03.stdout:Get:25 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-repoze.lru all 0.7-2 [12.1 kB]
2026-03-31T20:20:38.112 INFO:teuthology.orchestra.run.vm03.stdout:Get:26 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-routes all 2.5.1-1ubuntu1 [89.0 kB]
2026-03-31T20:20:38.113 INFO:teuthology.orchestra.run.vm03.stdout:Get:27 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn-lib amd64 0.23.2-5ubuntu6 [2058 kB]
2026-03-31T20:20:38.158 INFO:teuthology.orchestra.run.vm03.stdout:Get:28 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-joblib all 0.17.0-4ubuntu1 [204 kB]
2026-03-31T20:20:38.159 INFO:teuthology.orchestra.run.vm03.stdout:Get:29 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-threadpoolctl all 3.1.0-1 [21.3 kB]
2026-03-31T20:20:38.159 INFO:teuthology.orchestra.run.vm03.stdout:Get:30 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn all 0.23.2-5ubuntu6 [1829 kB]
2026-03-31T20:20:38.166 INFO:teuthology.orchestra.run.vm03.stdout:Get:31 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cachetools all 5.0.0-1 [9722 B]
2026-03-31T20:20:38.166 INFO:teuthology.orchestra.run.vm03.stdout:Get:32 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-rsa all 4.8-1 [28.4 kB]
2026-03-31T20:20:38.166 INFO:teuthology.orchestra.run.vm03.stdout:Get:33 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-google-auth all 1.5.1-3 [35.7 kB]
2026-03-31T20:20:38.166 INFO:teuthology.orchestra.run.vm03.stdout:Get:34 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-requests-oauthlib all 1.3.0+ds-0.1 [18.7 kB]
2026-03-31T20:20:38.167 INFO:teuthology.orchestra.run.vm03.stdout:Get:35 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-websocket all 1.2.3-1 [34.7 kB]
2026-03-31T20:20:38.167 INFO:teuthology.orchestra.run.vm03.stdout:Get:36 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-kubernetes all 12.0.1-1ubuntu1 [353 kB]
2026-03-31T20:20:38.172 INFO:teuthology.orchestra.run.vm03.stdout:Get:37 http://archive.ubuntu.com/ubuntu jammy/main amd64 libonig5 amd64 6.9.7.1-2build1 [172 kB]
2026-03-31T20:20:38.174 INFO:teuthology.orchestra.run.vm03.stdout:Get:38 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libjq1 amd64 1.6-2.1ubuntu3.1 [133 kB]
2026-03-31T20:20:38.176 INFO:teuthology.orchestra.run.vm03.stdout:Get:39 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 jq amd64 1.6-2.1ubuntu3.1 [52.5 kB]
2026-03-31T20:20:38.182 INFO:teuthology.orchestra.run.vm03.stdout:Get:40 http://archive.ubuntu.com/ubuntu jammy/main amd64 socat amd64 1.7.4.1-3ubuntu4 [349 kB]
2026-03-31T20:20:38.185 INFO:teuthology.orchestra.run.vm03.stdout:Get:41 http://archive.ubuntu.com/ubuntu jammy/universe amd64 xmlstarlet amd64 1.6.1-2.1 [265 kB]
2026-03-31T20:20:38.187 INFO:teuthology.orchestra.run.vm03.stdout:Get:42 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 nvme-cli amd64 1.16-3ubuntu0.3 [474 kB]
2026-03-31T20:20:38.192 INFO:teuthology.orchestra.run.vm03.stdout:Get:43 http://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python-asyncssh-doc all 2.5.0-1ubuntu0.1 [309 kB]
2026-03-31T20:20:38.194 INFO:teuthology.orchestra.run.vm03.stdout:Get:44 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B]
2026-03-31T20:20:38.194 INFO:teuthology.orchestra.run.vm03.stdout:Get:45 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pluggy all 0.13.0-7.1 [19.0 kB]
2026-03-31T20:20:38.194 INFO:teuthology.orchestra.run.vm03.stdout:Get:46 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-psutil amd64 5.9.0-1build1 [158 kB]
2026-03-31T20:20:38.196 INFO:teuthology.orchestra.run.vm03.stdout:Get:47 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-py all 1.10.0-1 [71.9 kB]
2026-03-31T20:20:38.196 INFO:teuthology.orchestra.run.vm03.stdout:Get:48 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-pygments all 2.11.2+dfsg-2ubuntu0.1 [750 kB]
2026-03-31T20:20:38.204 INFO:teuthology.orchestra.run.vm03.stdout:Get:49 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-toml all 0.10.2-1 [16.5 kB]
2026-03-31T20:20:38.204 INFO:teuthology.orchestra.run.vm03.stdout:Get:50 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pytest all 6.2.5-1ubuntu2 [214 kB]
2026-03-31T20:20:38.206 INFO:teuthology.orchestra.run.vm03.stdout:Get:51 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplejson amd64 3.17.6-1build1 [54.7 kB]
2026-03-31T20:20:38.211 INFO:teuthology.orchestra.run.vm03.stdout:Get:52 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-webob all 1:1.8.6-1.1ubuntu0.1 [86.7 kB]
2026-03-31T20:20:38.212 INFO:teuthology.orchestra.run.vm03.stdout:Get:53 http://archive.ubuntu.com/ubuntu jammy/universe amd64 qttranslations5-l10n all 5.15.3-1 [1983 kB]
2026-03-31T20:20:38.244 INFO:teuthology.orchestra.run.vm03.stdout:Get:54 http://archive.ubuntu.com/ubuntu jammy-updates/main amd64 smartmontools amd64 7.2-1ubuntu0.1 [583 kB]
2026-03-31T20:20:38.546 INFO:teuthology.orchestra.run.vm03.stdout:Get:55 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 librbd1 amd64 20.2.0-721-g5bb32787-1jammy [2867 kB]
2026-03-31T20:20:39.502 INFO:teuthology.orchestra.run.vm03.stdout:Get:56 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 librados2 amd64 20.2.0-721-g5bb32787-1jammy [3571 kB]
2026-03-31T20:20:39.850 INFO:teuthology.orchestra.run.vm03.stdout:Get:57 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs2 amd64 20.2.0-721-g5bb32787-1jammy [831 kB]
2026-03-31T20:20:39.960 INFO:teuthology.orchestra.run.vm03.stdout:Get:58 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 python3-rados amd64 20.2.0-721-g5bb32787-1jammy [364 kB]
2026-03-31T20:20:39.965 INFO:teuthology.orchestra.run.vm03.stdout:Get:59 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-argparse all 20.2.0-721-g5bb32787-1jammy [32.9 kB]
2026-03-31T20:20:39.965 INFO:teuthology.orchestra.run.vm03.stdout:Get:60 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 python3-cephfs amd64 20.2.0-721-g5bb32787-1jammy [184 kB]
2026-03-31T20:20:39.968 INFO:teuthology.orchestra.run.vm03.stdout:Get:61 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-common all 20.2.0-721-g5bb32787-1jammy [83.9 kB]
2026-03-31T20:20:40.067 INFO:teuthology.orchestra.run.vm03.stdout:Get:62 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 python3-rbd amd64 20.2.0-721-g5bb32787-1jammy [341 kB]
2026-03-31T20:20:40.072 INFO:teuthology.orchestra.run.vm03.stdout:Get:63 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 librgw2 amd64 20.2.0-721-g5bb32787-1jammy [8696 kB]
2026-03-31T20:20:40.759 INFO:teuthology.orchestra.run.vm03.stdout:Get:64 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 python3-rgw amd64 20.2.0-721-g5bb32787-1jammy [112 kB]
2026-03-31T20:20:40.760 INFO:teuthology.orchestra.run.vm03.stdout:Get:65 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 libradosstriper1 amd64 20.2.0-721-g5bb32787-1jammy [261 kB]
2026-03-31T20:20:40.768 INFO:teuthology.orchestra.run.vm03.stdout:Get:66 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-common amd64 20.2.0-721-g5bb32787-1jammy [29.3 MB]
2026-03-31T20:20:42.823 INFO:teuthology.orchestra.run.vm03.stdout:Get:67 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-base amd64 20.2.0-721-g5bb32787-1jammy [5416 kB]
2026-03-31T20:20:43.165 INFO:teuthology.orchestra.run.vm03.stdout:Get:68 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-modules-core all 20.2.0-721-g5bb32787-1jammy [246 kB]
2026-03-31T20:20:43.166 INFO:teuthology.orchestra.run.vm03.stdout:Get:69 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 libsqlite3-mod-ceph amd64 20.2.0-721-g5bb32787-1jammy [124 kB]
2026-03-31T20:20:43.168 INFO:teuthology.orchestra.run.vm03.stdout:Get:70 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr amd64 20.2.0-721-g5bb32787-1jammy [907 kB]
2026-03-31T20:20:43.235 INFO:teuthology.orchestra.run.vm03.stdout:Get:71 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mon amd64 20.2.0-721-g5bb32787-1jammy [6393 kB]
2026-03-31T20:20:43.680 INFO:teuthology.orchestra.run.vm03.stdout:Get:72 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-osd amd64 20.2.0-721-g5bb32787-1jammy [21.7 MB]
2026-03-31T20:20:45.101 INFO:teuthology.orchestra.run.vm03.stdout:Get:73 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph amd64 20.2.0-721-g5bb32787-1jammy [14.1 kB]
2026-03-31T20:20:45.101 INFO:teuthology.orchestra.run.vm03.stdout:Get:74 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-fuse amd64 20.2.0-721-g5bb32787-1jammy [955 kB]
2026-03-31T20:20:45.195 INFO:teuthology.orchestra.run.vm03.stdout:Get:75 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mds amd64 20.2.0-721-g5bb32787-1jammy [2341 kB]
2026-03-31T20:20:45.327 INFO:teuthology.orchestra.run.vm03.stdout:Get:76 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 cephadm amd64 20.2.0-721-g5bb32787-1jammy [1049 kB]
2026-03-31T20:20:45.420 INFO:teuthology.orchestra.run.vm03.stdout:Get:77 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-cephadm all 20.2.0-721-g5bb32787-1jammy [179 kB]
2026-03-31T20:20:45.429 INFO:teuthology.orchestra.run.vm03.stdout:Get:78 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-dashboard all 20.2.0-721-g5bb32787-1jammy [45.5 MB]
2026-03-31T20:20:48.400 INFO:teuthology.orchestra.run.vm03.stdout:Get:79 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-diskprediction-local all 20.2.0-721-g5bb32787-1jammy [8625 kB]
2026-03-31T20:20:48.946 INFO:teuthology.orchestra.run.vm03.stdout:Get:80 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-k8sevents all 20.2.0-721-g5bb32787-1jammy [14.2 kB]
2026-03-31T20:20:48.946 INFO:teuthology.orchestra.run.vm03.stdout:Get:81 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-test amd64 20.2.0-721-g5bb32787-1jammy [99.5 MB]
2026-03-31T20:20:54.969 INFO:teuthology.orchestra.run.vm03.stdout:Get:82 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 ceph-volume all 20.2.0-721-g5bb32787-1jammy [135 kB]
2026-03-31T20:20:54.969 INFO:teuthology.orchestra.run.vm03.stdout:Get:83 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-daemon amd64 20.2.0-721-g5bb32787-1jammy [43.2 kB]
2026-03-31T20:20:54.969 INFO:teuthology.orchestra.run.vm03.stdout:Get:84 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-proxy2 amd64 20.2.0-721-g5bb32787-1jammy [30.7 kB]
2026-03-31T20:20:54.970 INFO:teuthology.orchestra.run.vm03.stdout:Get:85 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-dev amd64 20.2.0-721-g5bb32787-1jammy [41.4 kB]
2026-03-31T20:20:54.970 INFO:teuthology.orchestra.run.vm03.stdout:Get:86 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 radosgw amd64 20.2.0-721-g5bb32787-1jammy [25.1 MB]
2026-03-31T20:20:56.177 INFO:teuthology.orchestra.run.vm03.stdout:Get:87 https://3.chacra.ceph.com/r/ceph/tentacle-release/5bb3278730741031382ca9c3dc9d221a942e06a2/ubuntu/jammy/flavors/default jammy/main amd64 rbd-fuse amd64 20.2.0-721-g5bb32787-1jammy [97.5 kB]
2026-03-31T20:20:56.422 INFO:teuthology.orchestra.run.vm03.stdout:Fetched 281 MB in 18s (15.4 MB/s)
2026-03-31T20:20:56.542 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package liblttng-ust1:amd64.
2026-03-31T20:20:56.577 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 119262 files and directories currently installed.)
2026-03-31T20:20:56.580 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../00-liblttng-ust1_2.13.1-1ubuntu1_amd64.deb ...
2026-03-31T20:20:56.581 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-31T20:20:56.602 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libdouble-conversion3:amd64.
2026-03-31T20:20:56.608 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../01-libdouble-conversion3_3.1.7-4_amd64.deb ...
2026-03-31T20:20:56.609 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-31T20:20:56.626 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libpcre2-16-0:amd64.
2026-03-31T20:20:56.634 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../02-libpcre2-16-0_10.39-3ubuntu0.1_amd64.deb ...
2026-03-31T20:20:56.635 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-31T20:20:56.657 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libqt5core5a:amd64.
2026-03-31T20:20:56.664 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../03-libqt5core5a_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-31T20:20:56.669 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-31T20:20:56.712 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libqt5dbus5:amd64.
2026-03-31T20:20:56.720 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../04-libqt5dbus5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-31T20:20:56.721 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-31T20:20:56.745 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libqt5network5:amd64.
2026-03-31T20:20:56.752 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../05-libqt5network5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ...
2026-03-31T20:20:56.753 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-31T20:20:56.777 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libthrift-0.16.0:amd64.
2026-03-31T20:20:56.784 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../06-libthrift-0.16.0_0.16.0-2_amd64.deb ...
2026-03-31T20:20:56.794 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-31T20:20:56.820 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../07-librbd1_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:56.823 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking librbd1 (20.2.0-721-g5bb32787-1jammy) over (17.2.9-0ubuntu0.22.04.2) ...
2026-03-31T20:20:56.889 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../08-librados2_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:56.892 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking librados2 (20.2.0-721-g5bb32787-1jammy) over (17.2.9-0ubuntu0.22.04.2) ...
2026-03-31T20:20:56.951 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libnbd0.
2026-03-31T20:20:56.957 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../09-libnbd0_1.10.5-1_amd64.deb ...
2026-03-31T20:20:56.958 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libnbd0 (1.10.5-1) ...
2026-03-31T20:20:56.976 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libcephfs2.
2026-03-31T20:20:56.982 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../10-libcephfs2_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:56.982 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libcephfs2 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:57.008 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-rados.
2026-03-31T20:20:57.015 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../11-python3-rados_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:57.015 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-rados (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:57.035 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-ceph-argparse.
2026-03-31T20:20:57.043 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../12-python3-ceph-argparse_20.2.0-721-g5bb32787-1jammy_all.deb ...
2026-03-31T20:20:57.044 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-ceph-argparse (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:57.058 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-cephfs.
2026-03-31T20:20:57.064 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../13-python3-cephfs_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:57.065 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-cephfs (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:57.081 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-ceph-common.
2026-03-31T20:20:57.089 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../14-python3-ceph-common_20.2.0-721-g5bb32787-1jammy_all.deb ...
2026-03-31T20:20:57.089 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-ceph-common (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:57.113 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-wcwidth.
2026-03-31T20:20:57.120 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../15-python3-wcwidth_0.2.5+dfsg1-1_all.deb ...
2026-03-31T20:20:57.121 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-31T20:20:57.151 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-prettytable.
2026-03-31T20:20:57.157 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../16-python3-prettytable_2.5.0-2_all.deb ...
2026-03-31T20:20:57.158 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-prettytable (2.5.0-2) ...
2026-03-31T20:20:57.174 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-rbd.
2026-03-31T20:20:57.180 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../17-python3-rbd_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:57.181 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-rbd (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:57.204 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package librdkafka1:amd64.
2026-03-31T20:20:57.211 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../18-librdkafka1_1.8.0-1build1_amd64.deb ...
2026-03-31T20:20:57.212 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-31T20:20:57.235 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package librgw2.
2026-03-31T20:20:57.242 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../19-librgw2_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:57.243 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking librgw2 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:57.416 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-rgw.
2026-03-31T20:20:57.421 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../20-python3-rgw_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:57.422 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-rgw (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:57.480 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package liboath0:amd64.
2026-03-31T20:20:57.487 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../21-liboath0_2.6.7-3ubuntu0.1_amd64.deb ...
2026-03-31T20:20:57.487 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-31T20:20:57.503 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libradosstriper1.
2026-03-31T20:20:57.511 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../22-libradosstriper1_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:57.512 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libradosstriper1 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:57.533 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-common.
2026-03-31T20:20:57.539 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../23-ceph-common_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:57.541 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-common (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:58.167 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-base.
2026-03-31T20:20:58.175 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../24-ceph-base_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:58.180 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-base (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:58.300 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jaraco.functools.
2026-03-31T20:20:58.308 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../25-python3-jaraco.functools_3.4.0-2_all.deb ...
2026-03-31T20:20:58.308 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jaraco.functools (3.4.0-2) ...
2026-03-31T20:20:58.325 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-cheroot.
2026-03-31T20:20:58.330 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../26-python3-cheroot_8.5.2+ds1-1ubuntu3.2_all.deb ...
2026-03-31T20:20:58.331 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-cheroot (8.5.2+ds1-1ubuntu3.2) ...
2026-03-31T20:20:58.350 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jaraco.classes.
2026-03-31T20:20:58.356 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../27-python3-jaraco.classes_3.2.1-3_all.deb ...
2026-03-31T20:20:58.356 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jaraco.classes (3.2.1-3) ...
2026-03-31T20:20:58.372 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jaraco.text.
2026-03-31T20:20:58.379 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../28-python3-jaraco.text_3.6.0-2_all.deb ...
2026-03-31T20:20:58.380 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jaraco.text (3.6.0-2) ...
2026-03-31T20:20:58.394 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jaraco.collections.
2026-03-31T20:20:58.400 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../29-python3-jaraco.collections_3.4.0-2_all.deb ...
2026-03-31T20:20:58.401 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jaraco.collections (3.4.0-2) ...
2026-03-31T20:20:58.416 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-tempora.
2026-03-31T20:20:58.423 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../30-python3-tempora_4.1.2-1_all.deb ...
2026-03-31T20:20:58.424 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-tempora (4.1.2-1) ...
2026-03-31T20:20:58.440 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-portend.
2026-03-31T20:20:58.446 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../31-python3-portend_3.0.0-1_all.deb ...
2026-03-31T20:20:58.447 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-portend (3.0.0-1) ...
2026-03-31T20:20:58.461 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-zc.lockfile.
2026-03-31T20:20:58.467 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../32-python3-zc.lockfile_2.0-1_all.deb ...
2026-03-31T20:20:58.468 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-zc.lockfile (2.0-1) ...
2026-03-31T20:20:58.483 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-cherrypy3.
2026-03-31T20:20:58.489 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../33-python3-cherrypy3_18.6.1-4_all.deb ...
2026-03-31T20:20:58.490 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-cherrypy3 (18.6.1-4) ...
2026-03-31T20:20:58.521 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-natsort.
2026-03-31T20:20:58.527 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../34-python3-natsort_8.0.2-1_all.deb ...
2026-03-31T20:20:58.528 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-natsort (8.0.2-1) ...
2026-03-31T20:20:58.545 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-modules-core.
2026-03-31T20:20:58.550 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../35-ceph-mgr-modules-core_20.2.0-721-g5bb32787-1jammy_all.deb ...
2026-03-31T20:20:58.551 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-modules-core (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:58.585 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libsqlite3-mod-ceph.
2026-03-31T20:20:58.592 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../36-libsqlite3-mod-ceph_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:58.592 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libsqlite3-mod-ceph (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:58.612 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr.
2026-03-31T20:20:58.618 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../37-ceph-mgr_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:58.619 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:58.645 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mon.
2026-03-31T20:20:58.651 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../38-ceph-mon_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:58.651 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mon (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:58.736 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libfuse2:amd64.
2026-03-31T20:20:58.742 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../39-libfuse2_2.9.9-5ubuntu3_amd64.deb ...
2026-03-31T20:20:58.743 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-31T20:20:58.762 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-osd.
2026-03-31T20:20:58.767 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../40-ceph-osd_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:58.768 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-osd (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:58.990 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph.
2026-03-31T20:20:58.996 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../41-ceph_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:58.997 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:59.013 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-fuse.
2026-03-31T20:20:59.019 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../42-ceph-fuse_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:59.019 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-fuse (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:59.047 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mds.
2026-03-31T20:20:59.053 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../43-ceph-mds_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:59.054 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mds (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:59.097 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package cephadm.
2026-03-31T20:20:59.102 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../44-cephadm_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:20:59.103 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking cephadm (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:59.122 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-asyncssh.
2026-03-31T20:20:59.128 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../45-python3-asyncssh_2.5.0-1ubuntu0.1_all.deb ...
2026-03-31T20:20:59.128 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-31T20:20:59.156 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-cephadm.
2026-03-31T20:20:59.162 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../46-ceph-mgr-cephadm_20.2.0-721-g5bb32787-1jammy_all.deb ...
2026-03-31T20:20:59.163 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-cephadm (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:59.189 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-repoze.lru.
2026-03-31T20:20:59.196 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../47-python3-repoze.lru_0.7-2_all.deb ...
2026-03-31T20:20:59.196 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-repoze.lru (0.7-2) ...
2026-03-31T20:20:59.214 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-routes.
2026-03-31T20:20:59.220 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../48-python3-routes_2.5.1-1ubuntu1_all.deb ...
2026-03-31T20:20:59.221 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-routes (2.5.1-1ubuntu1) ...
2026-03-31T20:20:59.321 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-dashboard.
2026-03-31T20:20:59.327 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../49-ceph-mgr-dashboard_20.2.0-721-g5bb32787-1jammy_all.deb ...
2026-03-31T20:20:59.328 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-dashboard (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:20:59.959 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-sklearn-lib:amd64.
2026-03-31T20:20:59.965 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../50-python3-sklearn-lib_0.23.2-5ubuntu6_amd64.deb ...
2026-03-31T20:20:59.966 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-31T20:21:00.022 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-joblib.
2026-03-31T20:21:00.030 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../51-python3-joblib_0.17.0-4ubuntu1_all.deb ...
2026-03-31T20:21:00.031 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-joblib (0.17.0-4ubuntu1) ...
2026-03-31T20:21:00.064 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-threadpoolctl.
2026-03-31T20:21:00.071 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../52-python3-threadpoolctl_3.1.0-1_all.deb ...
2026-03-31T20:21:00.072 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-threadpoolctl (3.1.0-1) ...
2026-03-31T20:21:00.088 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-sklearn.
2026-03-31T20:21:00.095 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../53-python3-sklearn_0.23.2-5ubuntu6_all.deb ...
2026-03-31T20:21:00.096 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-31T20:21:00.220 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-diskprediction-local.
2026-03-31T20:21:00.228 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../54-ceph-mgr-diskprediction-local_20.2.0-721-g5bb32787-1jammy_all.deb ...
2026-03-31T20:21:00.229 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-diskprediction-local (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:00.483 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-cachetools.
2026-03-31T20:21:00.489 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../55-python3-cachetools_5.0.0-1_all.deb ...
2026-03-31T20:21:00.490 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-cachetools (5.0.0-1) ...
2026-03-31T20:21:00.506 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-rsa.
2026-03-31T20:21:00.513 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../56-python3-rsa_4.8-1_all.deb ...
2026-03-31T20:21:00.514 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-rsa (4.8-1) ...
2026-03-31T20:21:00.532 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-google-auth.
2026-03-31T20:21:00.538 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../57-python3-google-auth_1.5.1-3_all.deb ...
2026-03-31T20:21:00.538 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-google-auth (1.5.1-3) ...
2026-03-31T20:21:00.559 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-requests-oauthlib.
2026-03-31T20:21:00.564 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../58-python3-requests-oauthlib_1.3.0+ds-0.1_all.deb ...
2026-03-31T20:21:00.565 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-31T20:21:00.583 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-websocket.
2026-03-31T20:21:00.589 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../59-python3-websocket_1.2.3-1_all.deb ...
2026-03-31T20:21:00.590 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-websocket (1.2.3-1) ...
2026-03-31T20:21:00.611 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-kubernetes.
2026-03-31T20:21:00.616 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../60-python3-kubernetes_12.0.1-1ubuntu1_all.deb ...
2026-03-31T20:21:00.617 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-31T20:21:00.754 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-k8sevents.
2026-03-31T20:21:00.761 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../61-ceph-mgr-k8sevents_20.2.0-721-g5bb32787-1jammy_all.deb ...
2026-03-31T20:21:00.762 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-k8sevents (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:00.778 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libonig5:amd64.
2026-03-31T20:21:00.784 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../62-libonig5_6.9.7.1-2build1_amd64.deb ...
2026-03-31T20:21:00.785 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-31T20:21:00.803 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libjq1:amd64.
2026-03-31T20:21:00.809 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../63-libjq1_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-31T20:21:00.810 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-31T20:21:00.824 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package jq.
2026-03-31T20:21:00.830 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../64-jq_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-31T20:21:00.832 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking jq (1.6-2.1ubuntu3.1) ...
2026-03-31T20:21:00.846 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package socat.
2026-03-31T20:21:00.852 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../65-socat_1.7.4.1-3ubuntu4_amd64.deb ...
2026-03-31T20:21:00.853 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking socat (1.7.4.1-3ubuntu4) ...
2026-03-31T20:21:00.878 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package xmlstarlet.
2026-03-31T20:21:00.884 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../66-xmlstarlet_1.6.1-2.1_amd64.deb ...
2026-03-31T20:21:00.885 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking xmlstarlet (1.6.1-2.1) ...
2026-03-31T20:21:00.936 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-test.
2026-03-31T20:21:00.943 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../67-ceph-test_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:21:00.944 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-test (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:02.196 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-volume.
2026-03-31T20:21:02.202 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../68-ceph-volume_20.2.0-721-g5bb32787-1jammy_all.deb ...
2026-03-31T20:21:02.203 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-volume (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:02.232 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libcephfs-daemon.
2026-03-31T20:21:02.239 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../69-libcephfs-daemon_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:21:02.240 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libcephfs-daemon (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:02.259 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libcephfs-proxy2.
2026-03-31T20:21:02.267 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../70-libcephfs-proxy2_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:21:02.268 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libcephfs-proxy2 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:02.285 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libcephfs-dev.
2026-03-31T20:21:02.291 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../71-libcephfs-dev_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:21:02.292 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libcephfs-dev (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:02.310 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package nvme-cli.
2026-03-31T20:21:02.315 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../72-nvme-cli_1.16-3ubuntu0.3_amd64.deb ...
2026-03-31T20:21:02.316 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking nvme-cli (1.16-3ubuntu0.3) ...
2026-03-31T20:21:02.353 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python-asyncssh-doc.
2026-03-31T20:21:02.360 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../73-python-asyncssh-doc_2.5.0-1ubuntu0.1_all.deb ...
2026-03-31T20:21:02.361 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-31T20:21:02.399 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-iniconfig.
2026-03-31T20:21:02.405 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../74-python3-iniconfig_1.1.1-2_all.deb ...
2026-03-31T20:21:02.406 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-iniconfig (1.1.1-2) ...
2026-03-31T20:21:02.423 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pluggy.
2026-03-31T20:21:02.428 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../75-python3-pluggy_0.13.0-7.1_all.deb ...
2026-03-31T20:21:02.429 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pluggy (0.13.0-7.1) ...
2026-03-31T20:21:02.446 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-psutil.
2026-03-31T20:21:02.453 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../76-python3-psutil_5.9.0-1build1_amd64.deb ...
2026-03-31T20:21:02.454 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-psutil (5.9.0-1build1) ...
2026-03-31T20:21:02.475 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-py.
2026-03-31T20:21:02.481 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../77-python3-py_1.10.0-1_all.deb ...
2026-03-31T20:21:02.483 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-py (1.10.0-1) ...
2026-03-31T20:21:02.506 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pygments.
2026-03-31T20:21:02.512 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../78-python3-pygments_2.11.2+dfsg-2ubuntu0.1_all.deb ...
2026-03-31T20:21:02.513 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-31T20:21:02.570 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-toml.
2026-03-31T20:21:02.576 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../79-python3-toml_0.10.2-1_all.deb ...
2026-03-31T20:21:02.577 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-toml (0.10.2-1) ...
2026-03-31T20:21:02.593 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pytest.
2026-03-31T20:21:02.599 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../80-python3-pytest_6.2.5-1ubuntu2_all.deb ...
2026-03-31T20:21:02.600 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pytest (6.2.5-1ubuntu2) ...
2026-03-31T20:21:02.638 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-simplejson.
2026-03-31T20:21:02.644 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../81-python3-simplejson_3.17.6-1build1_amd64.deb ...
2026-03-31T20:21:02.645 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-simplejson (3.17.6-1build1) ...
2026-03-31T20:21:02.667 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-webob.
2026-03-31T20:21:02.673 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../82-python3-webob_1%3a1.8.6-1.1ubuntu0.1_all.deb ...
2026-03-31T20:21:02.674 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-31T20:21:02.694 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package qttranslations5-l10n.
2026-03-31T20:21:02.698 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../83-qttranslations5-l10n_5.15.3-1_all.deb ...
2026-03-31T20:21:02.699 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking qttranslations5-l10n (5.15.3-1) ...
2026-03-31T20:21:02.793 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package radosgw.
2026-03-31T20:21:02.800 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../84-radosgw_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:21:02.800 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking radosgw (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:03.130 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package rbd-fuse.
2026-03-31T20:21:03.136 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../85-rbd-fuse_20.2.0-721-g5bb32787-1jammy_amd64.deb ...
2026-03-31T20:21:03.137 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking rbd-fuse (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:03.157 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package smartmontools.
2026-03-31T20:21:03.164 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../86-smartmontools_7.2-1ubuntu0.1_amd64.deb ...
2026-03-31T20:21:03.173 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking smartmontools (7.2-1ubuntu0.1) ...
2026-03-31T20:21:03.215 INFO:teuthology.orchestra.run.vm03.stdout:Setting up smartmontools (7.2-1ubuntu0.1) ...
2026-03-31T20:21:03.449 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/smartd.service → /lib/systemd/system/smartmontools.service.
2026-03-31T20:21:03.449 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartmontools.service → /lib/systemd/system/smartmontools.service.
2026-03-31T20:21:03.801 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-iniconfig (1.1.1-2) ...
2026-03-31T20:21:03.866 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-31T20:21:03.869 INFO:teuthology.orchestra.run.vm03.stdout:Setting up nvme-cli (1.16-3ubuntu0.3) ...
2026-03-31T20:21:03.937 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /lib/systemd/system/nvmefc-boot-connections.service.
2026-03-31T20:21:04.169 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmf-autoconnect.service → /lib/systemd/system/nvmf-autoconnect.service.
2026-03-31T20:21:04.541 INFO:teuthology.orchestra.run.vm03.stdout:nvmf-connect.target is a disabled or a static unit, not starting it.
2026-03-31T20:21:04.554 INFO:teuthology.orchestra.run.vm03.stdout:Setting up cephadm (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:04.599 INFO:teuthology.orchestra.run.vm03.stdout:Adding system user cephadm....done
2026-03-31T20:21:04.608 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jaraco.classes (3.2.1-3) ...
2026-03-31T20:21:04.675 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-31T20:21:04.677 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jaraco.functools (3.4.0-2) ...
2026-03-31T20:21:04.740 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-repoze.lru (0.7-2) ...
2026-03-31T20:21:04.809 INFO:teuthology.orchestra.run.vm03.stdout:Setting up liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-31T20:21:04.812 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-py (1.10.0-1) ...
2026-03-31T20:21:04.903 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-joblib (0.17.0-4ubuntu1) ...
2026-03-31T20:21:05.032 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-cachetools (5.0.0-1) ...
2026-03-31T20:21:05.105 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-threadpoolctl (3.1.0-1) ...
2026-03-31T20:21:05.174 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-ceph-argparse (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:05.245 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-31T20:21:05.247 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libnbd0 (1.10.5-1) ...
2026-03-31T20:21:05.250 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-31T20:21:05.252 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ...
2026-03-31T20:21:05.254 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-psutil (5.9.0-1build1) ...
2026-03-31T20:21:05.375 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-natsort (8.0.2-1) ...
2026-03-31T20:21:05.447 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libcephfs-proxy2 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:05.449 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-routes (2.5.1-1ubuntu1) ...
2026-03-31T20:21:05.518 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-simplejson (3.17.6-1build1) ...
2026-03-31T20:21:05.598 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-31T20:21:05.866 INFO:teuthology.orchestra.run.vm03.stdout:Setting up qttranslations5-l10n (5.15.3-1) ...
2026-03-31T20:21:05.869 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-31T20:21:05.958 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-31T20:21:06.095 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-cheroot (8.5.2+ds1-1ubuntu3.2) ...
2026-03-31T20:21:06.178 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jaraco.text (3.6.0-2) ...
2026-03-31T20:21:06.241 INFO:teuthology.orchestra.run.vm03.stdout:Setting up socat (1.7.4.1-3ubuntu4) ...
2026-03-31T20:21:06.244 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-ceph-common (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:06.341 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-31T20:21:06.898 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-31T20:21:06.903 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-toml (0.10.2-1) ...
2026-03-31T20:21:06.974 INFO:teuthology.orchestra.run.vm03.stdout:Setting up librdkafka1:amd64 (1.8.0-1build1) ...
2026-03-31T20:21:06.976 INFO:teuthology.orchestra.run.vm03.stdout:Setting up xmlstarlet (1.6.1-2.1) ...
2026-03-31T20:21:06.979 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pluggy (0.13.0-7.1) ...
2026-03-31T20:21:07.045 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-zc.lockfile (2.0-1) ...
2026-03-31T20:21:07.111 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-31T20:21:07.113 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-rsa (4.8-1) ...
2026-03-31T20:21:07.191 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-tempora (4.1.2-1) ...
2026-03-31T20:21:07.265 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-prettytable (2.5.0-2) ...
2026-03-31T20:21:07.338 INFO:teuthology.orchestra.run.vm03.stdout:Setting up liblttng-ust1:amd64 (2.13.1-1ubuntu1) ...
2026-03-31T20:21:07.340 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-websocket (1.2.3-1) ...
2026-03-31T20:21:07.419 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-31T20:21:07.421 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-31T20:21:07.490 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-31T20:21:07.579 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jaraco.collections (3.4.0-2) ...
2026-03-31T20:21:07.644 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-31T20:21:07.646 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pytest (6.2.5-1ubuntu2) ...
2026-03-31T20:21:07.780 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-portend (3.0.0-1) ...
2026-03-31T20:21:07.846 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ...
2026-03-31T20:21:07.848 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-google-auth (1.5.1-3) ...
2026-03-31T20:21:07.931 INFO:teuthology.orchestra.run.vm03.stdout:Setting up jq (1.6-2.1ubuntu3.1) ...
2026-03-31T20:21:07.933 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-cherrypy3 (18.6.1-4) ...
2026-03-31T20:21:08.059 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libthrift-0.16.0:amd64 (0.16.0-2) ...
2026-03-31T20:21:08.062 INFO:teuthology.orchestra.run.vm03.stdout:Setting up librados2 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:08.064 INFO:teuthology.orchestra.run.vm03.stdout:Setting up librgw2 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:08.066 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libsqlite3-mod-ceph (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:08.068 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-31T20:21:08.623 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libcephfs2 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:08.625 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libradosstriper1 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:08.628 INFO:teuthology.orchestra.run.vm03.stdout:Setting up librbd1 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:08.630 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-modules-core (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:08.632 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-fuse (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:08.691 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/remote-fs.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-31T20:21:08.691 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target.
2026-03-31T20:21:09.032 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libcephfs-dev (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:09.034 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-rados (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:09.036 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libcephfs-daemon (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:09.039 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-rbd (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:09.041 INFO:teuthology.orchestra.run.vm03.stdout:Setting up rbd-fuse (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:09.043 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-rgw (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:09.046 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-cephfs (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:09.048 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-common (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:09.080 INFO:teuthology.orchestra.run.vm03.stdout:Adding group ceph....done
2026-03-31T20:21:09.252 INFO:teuthology.orchestra.run.vm03.stdout:Adding system user ceph....done
2026-03-31T20:21:09.260 INFO:teuthology.orchestra.run.vm03.stdout:Setting system user ceph properties....done
2026-03-31T20:21:09.264 INFO:teuthology.orchestra.run.vm03.stdout:chown: cannot access '/var/log/ceph/*.log*': No such file or directory
2026-03-31T20:21:09.329 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /lib/systemd/system/ceph.target.
2026-03-31T20:21:09.547 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/rbdmap.service → /lib/systemd/system/rbdmap.service.
2026-03-31T20:21:09.926 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-test (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:09.928 INFO:teuthology.orchestra.run.vm03.stdout:Setting up radosgw (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:10.165 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-31T20:21:10.165 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target.
2026-03-31T20:21:10.494 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-base (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:10.581 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /lib/systemd/system/ceph-crash.service.
2026-03-31T20:21:10.944 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mds (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:11.007 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-31T20:21:11.007 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target.
2026-03-31T20:21:11.381 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:11.451 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-31T20:21:11.452 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target.
2026-03-31T20:21:11.833 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-osd (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:11.907 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-31T20:21:11.907 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target.
2026-03-31T20:21:12.253 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-k8sevents (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:12.255 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-diskprediction-local (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:12.267 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mon (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:12.325 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-31T20:21:12.325 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target.
2026-03-31T20:21:12.706 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-cephadm (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:12.717 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:12.720 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-dashboard (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:12.731 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-volume (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:21:12.839 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-31T20:21:12.911 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-31T20:21:13.192 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:13.192 INFO:teuthology.orchestra.run.vm03.stdout:Running kernel seems to be up-to-date.
2026-03-31T20:21:13.192 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:13.192 INFO:teuthology.orchestra.run.vm03.stdout:Services to be restarted:
2026-03-31T20:21:13.195 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart apache-htcacheclean.service
2026-03-31T20:21:13.200 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart rsyslog.service
2026-03-31T20:21:13.202 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:13.202 INFO:teuthology.orchestra.run.vm03.stdout:Service restarts being deferred:
2026-03-31T20:21:13.202 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart networkd-dispatcher.service
2026-03-31T20:21:13.202 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart unattended-upgrades.service
2026-03-31T20:21:13.203 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:13.203 INFO:teuthology.orchestra.run.vm03.stdout:No containers need to be restarted.
2026-03-31T20:21:13.203 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:13.203 INFO:teuthology.orchestra.run.vm03.stdout:No user sessions are running outdated binaries.
2026-03-31T20:21:13.203 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:13.203 INFO:teuthology.orchestra.run.vm03.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-31T20:21:13.869 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:21:13.871 DEBUG:teuthology.orchestra.run.vm03:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install python3-jmespath python3-xmltodict s3cmd
2026-03-31T20:21:13.946 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:21:14.066 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:21:14.066 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:21:14.159 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:21:14.159 INFO:teuthology.orchestra.run.vm03.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-31T20:21:14.159 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-31T20:21:14.159 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:21:14.170 INFO:teuthology.orchestra.run.vm03.stdout:The following NEW packages will be installed:
2026-03-31T20:21:14.170 INFO:teuthology.orchestra.run.vm03.stdout: python3-jmespath python3-xmltodict s3cmd
2026-03-31T20:21:14.396 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 3 newly installed, 0 to remove and 50 not upgraded.
2026-03-31T20:21:14.396 INFO:teuthology.orchestra.run.vm03.stdout:Need to get 155 kB of archives.
2026-03-31T20:21:14.396 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 678 kB of additional disk space will be used.
2026-03-31T20:21:14.396 INFO:teuthology.orchestra.run.vm03.stdout:Get:1 http://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jmespath all 0.10.0-1 [21.7 kB]
2026-03-31T20:21:14.627 INFO:teuthology.orchestra.run.vm03.stdout:Get:2 http://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-xmltodict all 0.12.0-2 [12.6 kB]
2026-03-31T20:21:14.649 INFO:teuthology.orchestra.run.vm03.stdout:Get:3 http://archive.ubuntu.com/ubuntu jammy/universe amd64 s3cmd all 2.2.0-1 [120 kB]
2026-03-31T20:21:15.057 INFO:teuthology.orchestra.run.vm03.stdout:Fetched 155 kB in 1s (216 kB/s)
2026-03-31T20:21:15.071 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jmespath.
2026-03-31T20:21:15.097 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 126082 files and directories currently installed.)
2026-03-31T20:21:15.099 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../python3-jmespath_0.10.0-1_all.deb ...
2026-03-31T20:21:15.100 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jmespath (0.10.0-1) ...
2026-03-31T20:21:15.114 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-xmltodict.
2026-03-31T20:21:15.119 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../python3-xmltodict_0.12.0-2_all.deb ...
2026-03-31T20:21:15.120 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-xmltodict (0.12.0-2) ...
2026-03-31T20:21:15.132 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package s3cmd.
2026-03-31T20:21:15.137 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../archives/s3cmd_2.2.0-1_all.deb ...
2026-03-31T20:21:15.138 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking s3cmd (2.2.0-1) ...
2026-03-31T20:21:15.167 INFO:teuthology.orchestra.run.vm03.stdout:Setting up s3cmd (2.2.0-1) ...
2026-03-31T20:21:15.251 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-xmltodict (0.12.0-2) ...
2026-03-31T20:21:15.311 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jmespath (0.10.0-1) ...
2026-03-31T20:21:15.376 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-31T20:21:15.669 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:15.669 INFO:teuthology.orchestra.run.vm03.stdout:Running kernel seems to be up-to-date.
2026-03-31T20:21:15.669 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:15.669 INFO:teuthology.orchestra.run.vm03.stdout:Services to be restarted:
2026-03-31T20:21:15.671 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart apache-htcacheclean.service
2026-03-31T20:21:15.676 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart rsyslog.service
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout:Service restarts being deferred:
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart networkd-dispatcher.service
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart unattended-upgrades.service
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout:No containers need to be restarted.
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout:No user sessions are running outdated binaries.
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:15.679 INFO:teuthology.orchestra.run.vm03.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host.
2026-03-31T20:21:16.349 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:21:16.351 DEBUG:teuthology.parallel:result is None
2026-03-31T20:21:16.352 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=5bb3278730741031382ca9c3dc9d221a942e06a2
2026-03-31T20:21:16.929 DEBUG:teuthology.orchestra.run.vm03:> dpkg-query -W -f '${Version}' ceph
2026-03-31T20:21:16.937 INFO:teuthology.orchestra.run.vm03.stdout:20.2.0-721-g5bb32787-1jammy
2026-03-31T20:21:16.937 INFO:teuthology.packaging:The installed version of ceph is 20.2.0-721-g5bb32787-1jammy
2026-03-31T20:21:16.937 INFO:teuthology.task.install:The correct ceph version 20.2.0-721-g5bb32787-1jammy is installed.
2026-03-31T20:21:16.937 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-31T20:21:16.938 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:21:16.938 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-31T20:21:16.989 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-31T20:21:16.989 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:21:16.989 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/daemon-helper
2026-03-31T20:21:17.036 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-31T20:21:17.084 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-31T20:21:17.084 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:21:17.084 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-31T20:21:17.132 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-31T20:21:17.184 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-31T20:21:17.184 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:21:17.184 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/stdin-killer
2026-03-31T20:21:17.232 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-31T20:21:17.280 INFO:teuthology.run_tasks:Running task ceph...
2026-03-31T20:21:17.316 INFO:tasks.ceph:Making ceph log dir writeable by non-root...
2026-03-31T20:21:17.316 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 777 /var/log/ceph
2026-03-31T20:21:17.329 INFO:tasks.ceph:Disabling ceph logrotate...
2026-03-31T20:21:17.329 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /etc/logrotate.d/ceph
2026-03-31T20:21:17.376 INFO:tasks.ceph:Creating extra log directories...
2026-03-31T20:21:17.377 DEBUG:teuthology.orchestra.run.vm03:> sudo install -d -m0777 -- /var/log/ceph/valgrind /var/log/ceph/profiling-logger
2026-03-31T20:21:17.428 INFO:tasks.ceph:Creating ceph cluster ceph...
2026-03-31T20:21:17.428 INFO:tasks.ceph:config {'conf': {'global': {'mon election default strategy': 3, 'ms bind msgr1': False, 'ms bind msgr2': True, 'ms type': 'async'}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon scrub interval': 300}, 'osd': {'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'avl', 'bluestore block size': 96636764160, 'bluestore compression algorithm': 'lz4', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'bluestore onode segment size': '1024K', 'bluestore write v2': True, 'bluestore zero block detection': True, 'debug bluefs': 20, 'debug bluestore': 20, 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': 10, 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd debug verify cached snaps': True, 'osd debug verify missing on start': True, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd mclock override recovery settings': True, 'osd mclock profile': 'high_recovery_ops', 'osd mclock skip benchmark': True, 'osd objectstore': 'bluestore', 'osd op queue': 'debug_random', 'osd op queue cut off': 'debug_random', 'osd_mclock_skip_benchmark': True}}, 'fs': 'xfs', 'mkfs_options': None, 'mount_options': None, 'skip_mgr_daemons': False, 'log_ignorelist': ['but it is still running', 'had wrong client addr', 'had wrong cluster addr', 'must scrub before tier agent can activate', 'failsafe engaged, dropping updates', 'failsafe disengaged, no longer dropping updates', 'overall HEALTH_', '\\(OSDMAP_FLAGS\\)', '\\(OSD_', '\\(PG_', '\\(SMALLER_PG_NUM\\)', '\\(SMALLER_PGP_NUM\\)', '\\(CACHE_POOL_NO_HIT_SET\\)', '\\(CACHE_POOL_NEAR_FULL\\)', '\\(FS_WITH_FAILED_MDS\\)', '\\(FS_DEGRADED\\)', '\\(POOL_BACKFILLFULL\\)', '\\(POOL_FULL\\)', '\\(SMALLER_PGP_NUM\\)', '\\(POOL_NEARFULL\\)', '\\(POOL_APP_NOT_ENABLED\\)', '\\(AUTH_BAD_CAPS\\)', '\\(FS_INLINE_DATA_DEPRECATED\\)', '\\(MON_DOWN\\)', '\\(SLOW_OPS\\)', 'slow request', '\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)'], 'cpu_profile': set(), 'cluster': 'ceph', 'mon_bind_msgr2': True, 'mon_bind_addrvec': True}
2026-03-31T20:21:17.428 INFO:tasks.ceph:ctx.config {'archive_path': '/archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4344', 'branch': 'tentacle', 'description': 'rados/singleton-bluestore/{all/cephtool mon_election/connectivity msgr-failures/none msgr/async-v2only objectstore/bluestore/{alloc$/{avl} base mem$/{normal-2} onode-segment$/{1M} write$/{v2/{compr$/{yes$/{lz4}} v2}}} rados supported-random-distro$/{ubuntu_latest}}', 'email': None, 'first_in_suite': False, 'flavor': 'default', 'job_id': '4344', 'ktype': 'distro', 'last_in_suite': False, 'machine_type': 'vps', 'name': 'kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps', 'no_nested_subset': False, 'openstack': [{'volumes': {'count': 3, 'size': 10}}], 'os_type': 'ubuntu', 'os_version': '22.04', 'overrides': {'admin_socket': {'branch': 'tentacle'}, 'ansible.cephlab': {'branch': 'main', 'repo': 'https://github.com/kshtsk/ceph-cm-ansible.git', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'logical_volumes': {'lv_1': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_2': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_3': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}, 'lv_4': {'scratch_dev': True, 'size': '25%VG', 'vg': 'vg_nvme'}}, 'timezone': 'UTC', 'volume_groups': {'vg_nvme': {'pvs': '/dev/vdb,/dev/vdc,/dev/vdd,/dev/vde'}}}}, 'ceph': {'conf': {'global': {'mon election default strategy': 3, 'ms bind msgr1': False, 'ms bind msgr2': True, 'ms type': 'async'}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon scrub interval': 300}, 'osd': {'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'avl', 'bluestore block size': 96636764160, 'bluestore compression algorithm': 'lz4', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'bluestore onode segment size': '1024K', 'bluestore write v2': True, 'bluestore zero block detection': True, 'debug bluefs': 20, 'debug bluestore': 20, 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': 10, 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd debug verify cached snaps': True, 'osd debug verify missing on start': True, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd mclock override recovery settings': True, 'osd mclock profile': 'high_recovery_ops', 'osd mclock skip benchmark': True, 'osd objectstore': 'bluestore', 'osd op queue': 'debug_random', 'osd op queue cut off': 'debug_random', 'osd_mclock_skip_benchmark': True}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)'], 'sha1': '5bb3278730741031382ca9c3dc9d221a942e06a2'}, 'ceph-deploy': {'conf': {'client': {'log file': '/var/log/ceph/ceph-$name.$pid.log'}, 'mon': {}}}, 'cephadm': {'cephadm_binary_url': 'https://download.ceph.com/rpm-20.2.0/el9/noarch/cephadm'}, 'install': {'ceph': {'flavor': 'default', 'sha1': '5bb3278730741031382ca9c3dc9d221a942e06a2'}, 'extra_system_packages': {'deb': ['python3-jmespath', 'python3-xmltodict', 's3cmd'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-jmespath', 'python3-xmltodict', 's3cmd']}}, 'thrashosds': {'bdev_inject_crash': 2, 'bdev_inject_crash_probability': 0.5}, 'workunit': {'branch': 'tt-tentacle', 'sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4'}}, 'owner': 'kyr', 'priority': 1000, 'repo': 'https://github.com/ceph/ceph.git', 'roles': [['mon.a', 'mon.b', 'mon.c', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']], 'seed': 6407, 'sha1': '5bb3278730741031382ca9c3dc9d221a942e06a2', 'sleep_before_teardown': 0, 'subset': '1/100000', 'suite': 'rados', 'suite_branch': 'tt-tentacle', 'suite_path': '/home/teuthos/src/github.com_kshtsk_ceph_0392f78529848ec72469e8e431875cb98d3a5fb4/qa', 'suite_relpath': 'qa', 'suite_repo': 'https://github.com/kshtsk/ceph.git', 'suite_sha1': '0392f78529848ec72469e8e431875cb98d3a5fb4', 'targets': {'vm03.local': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJoxb4u06Bk6dqJOB3WKNas+bv6SaRmj8p0RqqE/tvYpF5y2y7o+v+bN/kaQLQPm2kj/uk0WUkoXH7Dw2J1ZUqM='}, 'tasks': [{'internal.check_packages': None}, {'internal.buildpackages_prep': None}, {'internal.save_config': None}, {'internal.check_lock': None}, {'internal.add_remotes': None}, {'console_log': None}, {'internal.connect': None}, {'internal.push_inventory': None}, {'internal.serialize_remote_roles': None}, {'internal.check_conflict': None}, {'internal.check_ceph_data': None}, {'internal.vm_setup': None}, {'internal.base': None}, {'internal.archive_upload': None}, {'internal.archive': None}, {'internal.coredump': None}, {'internal.sudo': None}, {'internal.syslog': None}, {'internal.timer': None}, {'pcp': None}, {'selinux': None}, {'ansible.cephlab': None}, {'clock': None}, {'install': None}, {'ceph': {'log-ignorelist': ['but it is still running', 'had wrong client addr', 'had wrong cluster addr', 'must scrub before tier agent can activate', 'failsafe engaged, dropping updates', 'failsafe disengaged, no longer dropping updates', 'overall HEALTH_', '\\(OSDMAP_FLAGS\\)', '\\(OSD_', '\\(PG_', '\\(SMALLER_PG_NUM\\)', '\\(SMALLER_PGP_NUM\\)', '\\(CACHE_POOL_NO_HIT_SET\\)', '\\(CACHE_POOL_NEAR_FULL\\)', '\\(FS_WITH_FAILED_MDS\\)', '\\(FS_DEGRADED\\)', '\\(POOL_BACKFILLFULL\\)', '\\(POOL_FULL\\)', '\\(SMALLER_PGP_NUM\\)', '\\(POOL_NEARFULL\\)', '\\(POOL_APP_NOT_ENABLED\\)', '\\(AUTH_BAD_CAPS\\)', '\\(FS_INLINE_DATA_DEPRECATED\\)', '\\(MON_DOWN\\)', '\\(SLOW_OPS\\)', 'slow request', '\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)'], 'conf': {'global': {'mon election default strategy': 3, 'ms bind msgr1': False, 'ms bind msgr2': True, 'ms type': 'async'}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon scrub interval': 300}, 'osd': {'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'avl', 'bluestore block size': 96636764160, 'bluestore compression algorithm': 'lz4', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'bluestore onode segment size': '1024K', 'bluestore write v2': True, 'bluestore zero block detection': True, 'debug bluefs': 20, 'debug bluestore': 20, 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': 10, 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd debug verify cached snaps': True, 'osd debug verify missing on start': True, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd mclock override recovery settings': True, 'osd mclock profile': 'high_recovery_ops', 'osd mclock skip benchmark': True, 'osd objectstore': 'bluestore', 'osd op queue': 'debug_random', 'osd op queue cut off': 'debug_random', 'osd_mclock_skip_benchmark': True}}, 'flavor': 'default', 'fs': 'xfs', 'sha1': '5bb3278730741031382ca9c3dc9d221a942e06a2', 'cluster': 'ceph'}}, {'workunit': {'clients': {'all': ['cephtool', 'mon/pool_ops.sh']}}}], 'teuthology': {'fragments_dropped': [], 'meta': {}, 'postmerge': []}, 'teuthology_branch': 'uv2', 'teuthology_repo': 'https://github.com/kshtsk/teuthology', 'teuthology_sha1': 'a59626679648f962bca99d20d35578f2998c8f37', 'timestamp': '2026-03-31_11:18:10', 'tube': 'vps', 'user': 'kyr', 'verbose': False, 'worker_log': '/home/teuthos/.teuthology/dispatcher/dispatcher.vps.282426'}
2026-03-31T20:21:17.428
DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/ceph.data 2026-03-31T20:21:17.469 DEBUG:teuthology.orchestra.run.vm03:> sudo install -d -m0777 -- /var/run/ceph 2026-03-31T20:21:17.517 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:17.517 DEBUG:teuthology.orchestra.run.vm03:> dd if=/scratch_devs of=/dev/stdout 2026-03-31T20:21:17.561 DEBUG:teuthology.misc:devs=['/dev/vg_nvme/lv_1', '/dev/vg_nvme/lv_2', '/dev/vg_nvme/lv_3', '/dev/vg_nvme/lv_4'] 2026-03-31T20:21:17.561 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vg_nvme/lv_1 2026-03-31T20:21:17.605 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vg_nvme/lv_1 -> ../dm-0 2026-03-31T20:21:17.605 INFO:teuthology.orchestra.run.vm03.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-31T20:21:17.605 INFO:teuthology.orchestra.run.vm03.stdout:Device: 5h/5d Inode: 778 Links: 1 2026-03-31T20:21:17.605 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-31T20:21:17.605 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-31 20:20:17.885857000 +0000 2026-03-31T20:21:17.605 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-31 20:20:17.765857000 +0000 2026-03-31T20:21:17.605 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-31 20:20:17.765857000 +0000 2026-03-31T20:21:17.605 INFO:teuthology.orchestra.run.vm03.stdout: Birth: - 2026-03-31T20:21:17.605 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vg_nvme/lv_1 of=/dev/null count=1 2026-03-31T20:21:17.652 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-31T20:21:17.652 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-31T20:21:17.652 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000154509 s, 3.3 MB/s 2026-03-31T20:21:17.653 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_1 2026-03-31T20:21:17.698 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vg_nvme/lv_2 2026-03-31T20:21:17.741 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vg_nvme/lv_2 -> ../dm-1 2026-03-31T20:21:17.741 INFO:teuthology.orchestra.run.vm03.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-31T20:21:17.741 INFO:teuthology.orchestra.run.vm03.stdout:Device: 5h/5d Inode: 810 Links: 1 2026-03-31T20:21:17.741 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-31T20:21:17.741 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-31 20:20:18.165857000 +0000 2026-03-31T20:21:17.741 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-31 20:20:18.037857000 +0000 2026-03-31T20:21:17.741 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-31 20:20:18.037857000 +0000 2026-03-31T20:21:17.741 INFO:teuthology.orchestra.run.vm03.stdout: Birth: - 2026-03-31T20:21:17.741 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vg_nvme/lv_2 of=/dev/null count=1 2026-03-31T20:21:17.789 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-31T20:21:17.789 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-31T20:21:17.789 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000180008 s, 2.8 MB/s 2026-03-31T20:21:17.790 DEBUG:teuthology.orchestra.run.vm03:> ! 
mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_2 2026-03-31T20:21:17.833 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vg_nvme/lv_3 2026-03-31T20:21:17.877 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vg_nvme/lv_3 -> ../dm-2 2026-03-31T20:21:17.877 INFO:teuthology.orchestra.run.vm03.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-31T20:21:17.877 INFO:teuthology.orchestra.run.vm03.stdout:Device: 5h/5d Inode: 843 Links: 1 2026-03-31T20:21:17.877 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-31T20:21:17.877 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-31 20:20:18.445857000 +0000 2026-03-31T20:21:17.877 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-31 20:20:18.317857000 +0000 2026-03-31T20:21:17.877 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-31 20:20:18.317857000 +0000 2026-03-31T20:21:17.877 INFO:teuthology.orchestra.run.vm03.stdout: Birth: - 2026-03-31T20:21:17.877 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vg_nvme/lv_3 of=/dev/null count=1 2026-03-31T20:21:17.924 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-31T20:21:17.924 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-31T20:21:17.924 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000144801 s, 3.5 MB/s 2026-03-31T20:21:17.925 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_3 2026-03-31T20:21:17.969 DEBUG:teuthology.orchestra.run.vm03:> stat /dev/vg_nvme/lv_4 2026-03-31T20:21:18.013 INFO:teuthology.orchestra.run.vm03.stdout: File: /dev/vg_nvme/lv_4 -> ../dm-3 2026-03-31T20:21:18.013 INFO:teuthology.orchestra.run.vm03.stdout: Size: 7 Blocks: 0 IO Block: 4096 symbolic link 2026-03-31T20:21:18.013 INFO:teuthology.orchestra.run.vm03.stdout:Device: 5h/5d Inode: 872 Links: 1 2026-03-31T20:21:18.013 INFO:teuthology.orchestra.run.vm03.stdout:Access: (0777/lrwxrwxrwx) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-31T20:21:18.013 INFO:teuthology.orchestra.run.vm03.stdout:Access: 2026-03-31 20:20:22.253857000 +0000 2026-03-31T20:21:18.013 INFO:teuthology.orchestra.run.vm03.stdout:Modify: 2026-03-31 20:20:18.617857000 +0000 2026-03-31T20:21:18.013 INFO:teuthology.orchestra.run.vm03.stdout:Change: 2026-03-31 20:20:18.617857000 +0000 2026-03-31T20:21:18.013 INFO:teuthology.orchestra.run.vm03.stdout: Birth: - 2026-03-31T20:21:18.013 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/dev/vg_nvme/lv_4 of=/dev/null count=1 2026-03-31T20:21:18.061 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records in 2026-03-31T20:21:18.061 INFO:teuthology.orchestra.run.vm03.stderr:1+0 records out 2026-03-31T20:21:18.061 INFO:teuthology.orchestra.run.vm03.stderr:512 bytes copied, 0.000219511 s, 2.3 MB/s 2026-03-31T20:21:18.061 DEBUG:teuthology.orchestra.run.vm03:> ! mount | grep -v devtmpfs | grep -q /dev/vg_nvme/lv_4 2026-03-31T20:21:18.106 INFO:tasks.ceph:osd dev map: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-31T20:21:18.106 INFO:tasks.ceph:remote_to_roles_to_devs: {Remote(name='ubuntu@vm03.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}} 2026-03-31T20:21:18.106 INFO:tasks.ceph:Generating config... 
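The osd dev map above is built from the scratch-device list read out of /scratch_devs (the four LVs carved by the ansible overrides), with one device handed to each OSD role in order; with three OSDs, lv_4 is simply left unassigned. A minimal sketch of that assignment, under the assumption that it is a straight one-to-one pairing of roles and devices:

    devs = ["/dev/vg_nvme/lv_1", "/dev/vg_nvme/lv_2",
            "/dev/vg_nvme/lv_3", "/dev/vg_nvme/lv_4"]
    osd_roles = ["osd.0", "osd.1", "osd.2"]

    # One device per OSD, in order; surplus scratch devices go unused.
    osd_dev_map = dict(zip(osd_roles, devs))
    assert osd_dev_map == {
        "osd.0": "/dev/vg_nvme/lv_1",
        "osd.1": "/dev/vg_nvme/lv_2",
        "osd.2": "/dev/vg_nvme/lv_3",
    }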
2026-03-31T20:21:18.106 INFO:tasks.ceph:[global] mon election default strategy = 3 2026-03-31T20:21:18.106 INFO:tasks.ceph:[global] ms bind msgr1 = False 2026-03-31T20:21:18.106 INFO:tasks.ceph:[global] ms bind msgr2 = True 2026-03-31T20:21:18.106 INFO:tasks.ceph:[global] ms type = async 2026-03-31T20:21:18.106 INFO:tasks.ceph:[mgr] debug mgr = 20 2026-03-31T20:21:18.106 INFO:tasks.ceph:[mgr] debug ms = 1 2026-03-31T20:21:18.106 INFO:tasks.ceph:[mon] debug mon = 20 2026-03-31T20:21:18.106 INFO:tasks.ceph:[mon] debug ms = 1 2026-03-31T20:21:18.106 INFO:tasks.ceph:[mon] debug paxos = 20 2026-03-31T20:21:18.107 INFO:tasks.ceph:[mon] mon scrub interval = 300 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bdev async discard = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bdev enable discard = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bluestore allocator = avl 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bluestore block size = 96636764160 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bluestore compression algorithm = lz4 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bluestore compression mode = aggressive 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bluestore fsck on mount = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bluestore onode segment size = 1024K 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bluestore write v2 = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] bluestore zero block detection = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] debug bluefs = 20 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] debug bluestore = 20 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] debug ms = 1 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] debug osd = 20 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] debug rocksdb = 10 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] mon osd backfillfull_ratio = 0.85 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] mon osd full ratio = 0.9 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] mon osd nearfull ratio = 0.8 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd debug verify cached snaps = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd debug verify missing on start = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd failsafe full ratio = 0.95 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd mclock iops capacity threshold hdd = 49000 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd mclock override recovery settings = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd mclock profile = high_recovery_ops 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd mclock skip benchmark = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd objectstore = bluestore 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd op queue = debug_random 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd op queue cut off = debug_random 2026-03-31T20:21:18.107 INFO:tasks.ceph:[osd] osd_mclock_skip_benchmark = True 2026-03-31T20:21:18.107 INFO:tasks.ceph:Setting up mon.a... 2026-03-31T20:21:18.107 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring /etc/ceph/ceph.keyring 2026-03-31T20:21:18.162 INFO:teuthology.orchestra.run.vm03.stdout:creating /etc/ceph/ceph.keyring 2026-03-31T20:21:18.164 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=mon. 
/etc/ceph/ceph.keyring 2026-03-31T20:21:18.224 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 0644 /etc/ceph/ceph.keyring 2026-03-31T20:21:18.273 DEBUG:tasks.ceph:Ceph mon addresses: [('mon.a', '192.168.123.103'), ('mon.b', '[v2:192.168.123.103:3301,v1:192.168.123.103:6790]'), ('mon.c', '[v2:192.168.123.103:3302,v1:192.168.123.103:6791]')] 2026-03-31T20:21:18.273 DEBUG:tasks.ceph:writing out conf {'global': {'chdir': '', 'pid file': '/var/run/ceph/$cluster-$name.pid', 'auth supported': 'cephx', 'filestore xattr use omap': 'true', 'mon clock drift allowed': '1.000', 'osd crush chooseleaf type': '0', 'auth debug': 'true', 'ms die on old message': 'true', 'ms die on bug': 'true', 'mon max pg per osd': '10000', 'mon pg warn max object skew': '0', 'osd_pool_default_pg_autoscale_mode': 'off', 'osd pool default size': '2', 'mon osd allow primary affinity': 'true', 'mon osd allow pg remap': 'true', 'mon warn on legacy crush tunables': 'false', 'mon warn on crush straw calc version zero': 'false', 'mon warn on no sortbitwise': 'false', 'mon warn on osd down out interval zero': 'false', 'mon warn on too few osds': 'false', 'mon_warn_on_pool_pg_num_not_power_of_two': 'false', 'mon_warn_on_pool_no_redundancy': 'false', 'mon_allow_pool_size_one': 'true', 'osd pool default erasure code profile': 'plugin=isa technique=reed_sol_van k=2 m=1 crush-failure-domain=osd', 'osd default data pool replay window': '5', 'mon allow pool delete': 'true', 'mon cluster log file level': 'debug', 'debug asserts on shutdown': 'true', 'mon health detail to clog': 'false', 'mon host': '192.168.123.103,[v2:192.168.123.103:3301,v1:192.168.123.103:6790],[v2:192.168.123.103:3302,v1:192.168.123.103:6791]', 'mon election default strategy': 3, 'ms bind msgr1': False, 'ms bind msgr2': True, 'ms type': 'async'}, 'osd': {'osd journal size': '100', 'osd scrub load threshold': '5.0', 'osd scrub max interval': '600', 'osd mclock profile': 'high_recovery_ops', 'osd mclock skip benchmark': True, 'osd recover clone overlap': 'true', 'osd recovery max chunk': '1048576', 'osd debug shutdown': 'true', 'osd debug op order': 'true', 'osd debug verify stray on activate': 'true', 'osd debug trim objects': 'true', 'osd open classes on start': 'true', 'osd debug pg log writeout': 'true', 'osd deep scrub update digest min age': '30', 'osd map max advance': '10', 'journal zero on create': 'true', 'filestore ondisk finisher threads': '3', 'filestore apply finisher threads': '3', 'bdev debug aio': 'true', 'osd debug misdirected ops': 'true', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'avl', 'bluestore block size': 96636764160, 'bluestore compression algorithm': 'lz4', 'bluestore compression mode': 'aggressive', 'bluestore fsck on mount': True, 'bluestore onode segment size': '1024K', 'bluestore write v2': True, 'bluestore zero block detection': True, 'debug bluefs': 20, 'debug bluestore': 20, 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': 10, 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd debug verify cached snaps': True, 'osd debug verify missing on start': True, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd mclock override recovery settings': True, 'osd objectstore': 'bluestore', 'osd op queue': 'debug_random', 'osd op queue cut off': 'debug_random', 'osd_mclock_skip_benchmark': True}, 'mgr': {'debug ms': 1, 'debug mgr': 20, 'debug mon': '20', 'debug auth': '20', 'mon reweight min pgs per osd': '4', 'mon reweight min 
bytes per osd': '10', 'mgr/telemetry/nag': 'false'}, 'mon': {'debug ms': 1, 'debug mon': 20, 'debug paxos': 20, 'debug auth': '20', 'mon data avail warn': '5', 'mon mgr mkfs grace': '240', 'mon reweight min pgs per osd': '4', 'mon osd reporter subtree level': 'osd', 'mon osd prime pg temp': 'true', 'mon reweight min bytes per osd': '10', 'auth mon ticket ttl': '660', 'auth service ticket ttl': '240', 'mon_warn_on_insecure_global_id_reclaim': 'false', 'mon_warn_on_insecure_global_id_reclaim_allowed': 'false', 'mon_down_mkfs_grace': '2m', 'mon_warn_on_filestore_osds': 'false', 'mon scrub interval': 300}, 'client': {'rgw cache enabled': 'true', 'rgw enable ops log': 'true', 'rgw enable usage log': 'true', 'log file': '/var/log/ceph/$cluster-$name.$pid.log', 'admin socket': '/var/run/ceph/$cluster-$name.$pid.asok'}, 'mon.a': {}, 'mon.b': {}, 'mon.c': {}} 2026-03-31T20:21:18.273 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:18.273 DEBUG:teuthology.orchestra.run.vm03:> dd of=/home/ubuntu/cephtest/ceph.tmp.conf 2026-03-31T20:21:18.317 DEBUG:teuthology.orchestra.run.vm03:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage monmaptool -c /home/ubuntu/cephtest/ceph.tmp.conf --create --clobber --enable-all-features --add a 192.168.123.103 --addv b '[v2:192.168.123.103:3301,v1:192.168.123.103:6790]' --addv c '[v2:192.168.123.103:3302,v1:192.168.123.103:6791]' --print /home/ubuntu/cephtest/ceph.monmap 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:monmaptool: monmap file /home/ubuntu/cephtest/ceph.monmap 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:monmaptool: generated fsid a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:setting min_mon_release = tentacle 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:epoch 0 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:fsid a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:last_changed 2026-03-31T20:21:18.374590+0000 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:created 2026-03-31T20:21:18.374590+0000 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:min_mon_release 20 (tentacle) 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:election_strategy: 3 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.a 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:1: [v2:192.168.123.103:3301/0,v1:192.168.123.103:6790/0] mon.b 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:2: [v2:192.168.123.103:3302/0,v1:192.168.123.103:6791/0] mon.c 2026-03-31T20:21:18.376 INFO:teuthology.orchestra.run.vm03.stdout:monmaptool: writing epoch 0 to /home/ubuntu/cephtest/ceph.monmap (3 monitors) 2026-03-31T20:21:18.377 DEBUG:teuthology.orchestra.run.vm03:> rm -- /home/ubuntu/cephtest/ceph.tmp.conf 2026-03-31T20:21:18.421 INFO:tasks.ceph:Writing /etc/ceph/ceph.conf for FSID a4a0ca01-ae82-443e-a7c7-50605716689a... 
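The monmaptool invocation above packs all three monitors onto the single vps host by offsetting ports per rank: mon.a is added by bare IP (default v2:3300/v1:6789, as the printed monmap confirms), while mon.b and mon.c get explicit addrvecs at 3301/6790 and 3302/6791. A sketch of that address construction, with the offset rule inferred from the printed monmap rather than taken from teuthology source:

    def mon_addrs(ip: str, names: list[str]) -> list[tuple[str, str]]:
        out = []
        for i, name in enumerate(names):
            if i == 0:
                out.append((name, ip))  # first mon: bare IP, default ports
            else:
                out.append((name, f"[v2:{ip}:{3300 + i},v1:{ip}:{6789 + i}]"))
        return out

    assert mon_addrs("192.168.123.103", ["mon.a", "mon.b", "mon.c"])[1] == (
        "mon.b", "[v2:192.168.123.103:3301,v1:192.168.123.103:6790]")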
2026-03-31T20:21:18.421 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /etc/ceph && sudo chmod 0755 /etc/ceph && sudo tee /etc/ceph/ceph.conf && sudo chmod 0644 /etc/ceph/ceph.conf > /dev/null 2026-03-31T20:21:18.477 INFO:teuthology.orchestra.run.vm03.stdout:[global] 2026-03-31T20:21:18.477 INFO:teuthology.orchestra.run.vm03.stdout: chdir = "" 2026-03-31T20:21:18.477 INFO:teuthology.orchestra.run.vm03.stdout: pid file = /var/run/ceph/$cluster-$name.pid 2026-03-31T20:21:18.477 INFO:teuthology.orchestra.run.vm03.stdout: auth supported = cephx 2026-03-31T20:21:18.477 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.477 INFO:teuthology.orchestra.run.vm03.stdout: filestore xattr use omap = true 2026-03-31T20:21:18.477 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.477 INFO:teuthology.orchestra.run.vm03.stdout: mon clock drift allowed = 1.000 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: osd crush chooseleaf type = 0 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: auth debug = true 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: ms die on old message = true 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: ms die on bug = true 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon max pg per osd = 10000 # >= luminous 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon pg warn max object skew = 0 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: # disable pg_autoscaler by default for new pools 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: osd_pool_default_pg_autoscale_mode = off 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: osd pool default size = 2 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon osd allow primary affinity = true 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon osd allow pg remap = true 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon warn on legacy crush tunables = false 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon warn on crush straw calc version zero = false 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon warn on no sortbitwise = false 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon warn on osd down out interval zero = false 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon warn on too few osds = false 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon_warn_on_pool_pg_num_not_power_of_two = false 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon_warn_on_pool_no_redundancy = false 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon_allow_pool_size_one = true 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: osd pool default erasure code profile = plugin=isa technique=reed_sol_van k=2 m=1 crush-failure-domain=osd 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 
2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: osd default data pool replay window = 5 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon allow pool delete = true 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon cluster log file level = debug 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: debug asserts on shutdown = true 2026-03-31T20:21:18.478 INFO:teuthology.orchestra.run.vm03.stdout: mon health detail to clog = false 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: mon host = "192.168.123.103,[v2:192.168.123.103:3301,v1:192.168.123.103:6790],[v2:192.168.123.103:3302,v1:192.168.123.103:6791]" 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: mon election default strategy = 3 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: ms bind msgr1 = False 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: ms bind msgr2 = True 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: ms type = async 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: fsid = a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout:[osd] 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd journal size = 100 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd scrub load threshold = 5.0 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd scrub max interval = 600 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd mclock profile = high_recovery_ops 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd mclock skip benchmark = True 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd recover clone overlap = true 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd recovery max chunk = 1048576 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd debug shutdown = true 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd debug op order = true 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd debug verify stray on activate = true 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd debug trim objects = true 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd open classes on start = true 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd debug pg log writeout = true 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd deep scrub update digest min age = 30 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd map max advance = 10 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: journal zero on create = true 2026-03-31T20:21:18.479 
INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: filestore ondisk finisher threads = 3 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: filestore apply finisher threads = 3 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: bdev debug aio = true 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: osd debug misdirected ops = true 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: bdev async discard = True 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: bdev enable discard = True 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: bluestore allocator = avl 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: bluestore block size = 96636764160 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: bluestore compression algorithm = lz4 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: bluestore compression mode = aggressive 2026-03-31T20:21:18.479 INFO:teuthology.orchestra.run.vm03.stdout: bluestore fsck on mount = True 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: bluestore onode segment size = 1024K 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: bluestore write v2 = True 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: bluestore zero block detection = True 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug bluefs = 20 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug bluestore = 20 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug ms = 1 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug osd = 20 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug rocksdb = 10 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: mon osd backfillfull_ratio = 0.85 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: mon osd full ratio = 0.9 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: mon osd nearfull ratio = 0.8 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: osd debug verify cached snaps = True 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: osd debug verify missing on start = True 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: osd failsafe full ratio = 0.95 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: osd mclock iops capacity threshold hdd = 49000 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: osd mclock override recovery settings = True 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: osd objectstore = bluestore 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: osd op queue = debug_random 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: osd op queue cut off = debug_random 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: osd_mclock_skip_benchmark = True 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout:[mgr] 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug ms = 1 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug mgr = 20 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug mon = 20 2026-03-31T20:21:18.480 
INFO:teuthology.orchestra.run.vm03.stdout: debug auth = 20 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: mon reweight min pgs per osd = 4 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: mon reweight min bytes per osd = 10 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: mgr/telemetry/nag = false 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout:[mon] 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug ms = 1 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug mon = 20 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug paxos = 20 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: debug auth = 20 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: mon data avail warn = 5 2026-03-31T20:21:18.480 INFO:teuthology.orchestra.run.vm03.stdout: mon mgr mkfs grace = 240 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: mon reweight min pgs per osd = 4 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: mon osd reporter subtree level = osd 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: mon osd prime pg temp = true 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: mon reweight min bytes per osd = 10 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: # rotate auth tickets quickly to exercise renewal paths 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: auth mon ticket ttl = 660 # 11m 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: auth service ticket ttl = 240 # 4m 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: # don't complain about insecure global_id in the test suite 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: mon_warn_on_insecure_global_id_reclaim = false 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: mon_warn_on_insecure_global_id_reclaim_allowed = false 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: # 1m isn't quite enough 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: mon_down_mkfs_grace = 2m 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: mon_warn_on_filestore_osds = false 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: mon scrub interval = 300 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout:[client] 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: rgw cache enabled = true 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: rgw enable ops log = true 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: rgw enable usage log = true 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: log file = /var/log/ceph/$cluster-$name.$pid.log 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout: admin socket = /var/run/ceph/$cluster-$name.$pid.asok 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout:[mon.a] 2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout:[mon.b] 
2026-03-31T20:21:18.481 INFO:teuthology.orchestra.run.vm03.stdout:[mon.c] 2026-03-31T20:21:18.483 INFO:tasks.ceph:Creating admin key on mon.a... 2026-03-31T20:21:18.484 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --gen-key --name=client.admin --cap mon 'allow *' --cap osd 'allow *' --cap mds 'allow *' --cap mgr 'allow *' /etc/ceph/ceph.keyring 2026-03-31T20:21:18.547 INFO:tasks.ceph:Copying monmap to all nodes... 2026-03-31T20:21:18.547 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:18.548 DEBUG:teuthology.orchestra.run.vm03:> dd if=/etc/ceph/ceph.keyring of=/dev/stdout 2026-03-31T20:21:18.593 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:18.593 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/ceph.monmap of=/dev/stdout 2026-03-31T20:21:18.638 INFO:tasks.ceph:Sending monmap to node ubuntu@vm03.local 2026-03-31T20:21:18.638 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:18.638 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/ceph/ceph.keyring 2026-03-31T20:21:18.638 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 0644 /etc/ceph/ceph.keyring 2026-03-31T20:21:18.692 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:18.692 DEBUG:teuthology.orchestra.run.vm03:> dd of=/home/ubuntu/cephtest/ceph.monmap 2026-03-31T20:21:18.738 INFO:tasks.ceph:Setting up mon nodes... 2026-03-31T20:21:18.738 INFO:tasks.ceph:Setting up mgr nodes... 2026-03-31T20:21:18.738 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /var/lib/ceph/mgr/ceph-x && sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=mgr.x /var/lib/ceph/mgr/ceph-x/keyring 2026-03-31T20:21:18.801 INFO:teuthology.orchestra.run.vm03.stdout:creating /var/lib/ceph/mgr/ceph-x/keyring 2026-03-31T20:21:18.803 INFO:tasks.ceph:Setting up mds nodes... 2026-03-31T20:21:18.803 INFO:tasks.ceph_client:Setting up client nodes... 2026-03-31T20:21:18.803 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool --create-keyring --gen-key --name=client.0 /etc/ceph/ceph.client.0.keyring && sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-31T20:21:18.865 INFO:teuthology.orchestra.run.vm03.stdout:creating /etc/ceph/ceph.client.0.keyring 2026-03-31T20:21:18.872 INFO:tasks.ceph:Running mkfs on osd nodes... 
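Each entity key above (mgr.x, client.0, and the OSD keys created during mkfs below) is generated in place with ceph-authtool and later appended to /etc/ceph/ceph.keyring with role-appropriate caps; see the "Adding keys to all mons" step further down. A minimal sketch of composing one of those cap-assignment command lines, with the cap strings copied from the log:

    def authtool_caps_cmd(keyring: str, name: str, caps: dict[str, str]) -> list[str]:
        # Mirrors: ceph-authtool <keyring> --name=<entity> --cap <svc> '<capspec>' ...
        cmd = ["sudo", "ceph-authtool", keyring, f"--name={name}"]
        for svc, spec in caps.items():
            cmd += ["--cap", svc, spec]
        return cmd

    # The caps applied to osd.0 in the log:
    print(authtool_caps_cmd("/etc/ceph/ceph.keyring", "osd.0",
          {"mon": "allow profile osd", "mgr": "allow profile osd", "osd": "allow *"}))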
2026-03-31T20:21:18.872 INFO:tasks.ceph:ctx.disk_config.remote_to_roles_to_dev: {Remote(name='ubuntu@vm03.local'): {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'}} 2026-03-31T20:21:18.872 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /var/lib/ceph/osd/ceph-0 2026-03-31T20:21:18.921 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-31T20:21:18.921 INFO:tasks.ceph:role: osd.0 2026-03-31T20:21:18.921 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_1 on ubuntu@vm03.local 2026-03-31T20:21:18.921 DEBUG:teuthology.orchestra.run.vm03:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_1 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout:meta-data=/dev/vg_nvme/lv_1 isize=2048 agcount=4, agsize=1310464 blks 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout: = sectsz=512 attr=2, projid32bit=1 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout: = reflink=1 bigtime=0 inobtcount=0 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout:data = bsize=4096 blocks=5241856, imaxpct=25 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout: = sunit=0 swidth=0 blks 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout:log =internal log bsize=4096 blocks=2560, version=2 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout: = sectsz=512 sunit=0 blks, lazy-count=1 2026-03-31T20:21:18.971 INFO:teuthology.orchestra.run.vm03.stdout:realtime =none extsz=4096 blocks=0, rtextents=0 2026-03-31T20:21:18.979 INFO:teuthology.orchestra.run.vm03.stdout:Discarding blocks...Done. 
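Each OSD data device is formatted with `mkfs.xfs -f -i size=2048` (-f overwrites any existing filesystem; 2 KiB inodes leave headroom for Ceph's extended attributes, visible as isize=2048 in the mkfs output) and then mounted noatime. A minimal sketch of the format-and-mount step, assuming local execution instead of teuthology's remote runner:

    import subprocess

    def format_and_mount(dev: str, mountpoint: str) -> None:
        subprocess.run(["sudo", "mkdir", "-p", mountpoint], check=True)
        # -f: force over an existing fs; -i size=2048: large inodes for xattrs.
        subprocess.run(["sudo", "mkfs.xfs", "-f", "-i", "size=2048", dev], check=True)
        subprocess.run(["sudo", "mount", "-t", "xfs", "-o", "noatime",
                        dev, mountpoint], check=True)

    # As in the log:
    # format_and_mount("/dev/vg_nvme/lv_1", "/var/lib/ceph/osd/ceph-0")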
2026-03-31T20:21:18.980 INFO:tasks.ceph:mount /dev/vg_nvme/lv_1 on ubuntu@vm03.local -o noatime 2026-03-31T20:21:18.980 DEBUG:teuthology.orchestra.run.vm03:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_1 /var/lib/ceph/osd/ceph-0 2026-03-31T20:21:19.070 DEBUG:teuthology.orchestra.run.vm03:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-0 2026-03-31T20:21:19.116 INFO:teuthology.orchestra.run.vm03.stderr:sudo: /sbin/restorecon: command not found 2026-03-31T20:21:19.116 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-31T20:21:19.117 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /var/lib/ceph/osd/ceph-1 2026-03-31T20:21:19.166 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-31T20:21:19.166 INFO:tasks.ceph:role: osd.1 2026-03-31T20:21:19.166 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_2 on ubuntu@vm03.local 2026-03-31T20:21:19.166 DEBUG:teuthology.orchestra.run.vm03:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_2 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout:meta-data=/dev/vg_nvme/lv_2 isize=2048 agcount=4, agsize=1310464 blks 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout: = sectsz=512 attr=2, projid32bit=1 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout: = reflink=1 bigtime=0 inobtcount=0 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout:data = bsize=4096 blocks=5241856, imaxpct=25 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout: = sunit=0 swidth=0 blks 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout:log =internal log bsize=4096 blocks=2560, version=2 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout: = sectsz=512 sunit=0 blks, lazy-count=1 2026-03-31T20:21:19.213 INFO:teuthology.orchestra.run.vm03.stdout:realtime =none extsz=4096 blocks=0, rtextents=0 2026-03-31T20:21:19.218 INFO:teuthology.orchestra.run.vm03.stdout:Discarding blocks...Done. 
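The `/sbin/restorecon: command not found` failures (above for ceph-0, and twice more below) are benign: this Ubuntu guest carries no SELinux userland, and the run clearly proceeds past the nonzero exit, so the task treats relabeling as best-effort. A sketch of that tolerate-failure pattern with plain subprocess, standing in for whatever mechanism teuthology actually uses to ignore the exit status:

    import subprocess

    def run_best_effort(cmd: list[str]) -> int:
        # Capture the status instead of raising, log it, and move on -- suitable
        # for steps like restorecon that only apply on SELinux-enabled hosts.
        proc = subprocess.run(cmd)
        if proc.returncode != 0:
            print(f"ignoring failure ({proc.returncode}): {' '.join(cmd)}")
        return proc.returncode

    # run_best_effort(["sudo", "/sbin/restorecon", "/var/lib/ceph/osd/ceph-0"])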
2026-03-31T20:21:19.219 INFO:tasks.ceph:mount /dev/vg_nvme/lv_2 on ubuntu@vm03.local -o noatime 2026-03-31T20:21:19.219 DEBUG:teuthology.orchestra.run.vm03:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_2 /var/lib/ceph/osd/ceph-1 2026-03-31T20:21:19.274 DEBUG:teuthology.orchestra.run.vm03:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-1 2026-03-31T20:21:19.320 INFO:teuthology.orchestra.run.vm03.stderr:sudo: /sbin/restorecon: command not found 2026-03-31T20:21:19.320 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-31T20:21:19.320 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /var/lib/ceph/osd/ceph-2 2026-03-31T20:21:19.370 INFO:tasks.ceph:roles_to_devs: {'osd.0': '/dev/vg_nvme/lv_1', 'osd.1': '/dev/vg_nvme/lv_2', 'osd.2': '/dev/vg_nvme/lv_3'} 2026-03-31T20:21:19.370 INFO:tasks.ceph:role: osd.2 2026-03-31T20:21:19.370 INFO:tasks.ceph:['mkfs.xfs', '-f', '-i', 'size=2048'] on /dev/vg_nvme/lv_3 on ubuntu@vm03.local 2026-03-31T20:21:19.370 DEBUG:teuthology.orchestra.run.vm03:> yes | sudo mkfs.xfs -f -i size=2048 /dev/vg_nvme/lv_3 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout:meta-data=/dev/vg_nvme/lv_3 isize=2048 agcount=4, agsize=1310464 blks 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout: = sectsz=512 attr=2, projid32bit=1 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout: = crc=1 finobt=1, sparse=1, rmapbt=0 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout: = reflink=1 bigtime=0 inobtcount=0 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout:data = bsize=4096 blocks=5241856, imaxpct=25 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout: = sunit=0 swidth=0 blks 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout:naming =version 2 bsize=4096 ascii-ci=0, ftype=1 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout:log =internal log bsize=4096 blocks=2560, version=2 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout: = sectsz=512 sunit=0 blks, lazy-count=1 2026-03-31T20:21:19.418 INFO:teuthology.orchestra.run.vm03.stdout:realtime =none extsz=4096 blocks=0, rtextents=0 2026-03-31T20:21:19.422 INFO:teuthology.orchestra.run.vm03.stdout:Discarding blocks...Done. 
2026-03-31T20:21:19.424 INFO:tasks.ceph:mount /dev/vg_nvme/lv_3 on ubuntu@vm03.local -o noatime 2026-03-31T20:21:19.424 DEBUG:teuthology.orchestra.run.vm03:> sudo mount -t xfs -o noatime /dev/vg_nvme/lv_3 /var/lib/ceph/osd/ceph-2 2026-03-31T20:21:19.481 DEBUG:teuthology.orchestra.run.vm03:> sudo /sbin/restorecon /var/lib/ceph/osd/ceph-2 2026-03-31T20:21:19.528 INFO:teuthology.orchestra.run.vm03.stderr:sudo: /sbin/restorecon: command not found 2026-03-31T20:21:19.528 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-31T20:21:19.528 DEBUG:teuthology.orchestra.run.vm03:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 0 --monmap /home/ubuntu/cephtest/ceph.monmap 2026-03-31T20:21:19.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:19.584+0000 7f24912aea40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-0/keyring: can't open /var/lib/ceph/osd/ceph-0/keyring: (2) No such file or directory 2026-03-31T20:21:19.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:19.584+0000 7f24912aea40 -1 created new key in keyring /var/lib/ceph/osd/ceph-0/keyring 2026-03-31T20:21:19.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:19.584+0000 7f24912aea40 -1 bdev(0x565459cad800 /var/lib/ceph/osd/ceph-0/block) open stat got: (1) Operation not permitted 2026-03-31T20:21:19.590 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:19.584+0000 7f24912aea40 -1 bluestore(/var/lib/ceph/osd/ceph-0) _read_fsid unparsable uuid 2026-03-31T20:21:20.462 DEBUG:teuthology.orchestra.run.vm03:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-31T20:21:20.510 DEBUG:teuthology.orchestra.run.vm03:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 1 --monmap /home/ubuntu/cephtest/ceph.monmap 2026-03-31T20:21:20.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:20.564+0000 7f2b05d4ba40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-1/keyring: can't open /var/lib/ceph/osd/ceph-1/keyring: (2) No such file or directory 2026-03-31T20:21:20.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:20.564+0000 7f2b05d4ba40 -1 created new key in keyring /var/lib/ceph/osd/ceph-1/keyring 2026-03-31T20:21:20.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:20.564+0000 7f2b05d4ba40 -1 bdev(0x55e48d2e9800 /var/lib/ceph/osd/ceph-1/block) open stat got: (1) Operation not permitted 2026-03-31T20:21:20.570 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:20.564+0000 7f2b05d4ba40 -1 bluestore(/var/lib/ceph/osd/ceph-1) _read_fsid unparsable uuid 2026-03-31T20:21:21.417 DEBUG:teuthology.orchestra.run.vm03:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-31T20:21:21.465 DEBUG:teuthology.orchestra.run.vm03:> sudo MALLOC_CHECK_=3 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-osd --no-mon-config --cluster ceph --mkfs --mkkey -i 2 --monmap /home/ubuntu/cephtest/ceph.monmap 2026-03-31T20:21:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:21.524+0000 7f4cfa7ffa40 -1 auth: error reading file: /var/lib/ceph/osd/ceph-2/keyring: can't open /var/lib/ceph/osd/ceph-2/keyring: (2) No such file or directory 2026-03-31T20:21:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:21.524+0000 7f4cfa7ffa40 -1 created new key in keyring /var/lib/ceph/osd/ceph-2/keyring 2026-03-31T20:21:21.527 
INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:21.524+0000 7f4cfa7ffa40 -1 bdev(0x556226515800 /var/lib/ceph/osd/ceph-2/block) open stat got: (1) Operation not permitted 2026-03-31T20:21:21.527 INFO:teuthology.orchestra.run.vm03.stderr:2026-03-31T20:21:21.524+0000 7f4cfa7ffa40 -1 bluestore(/var/lib/ceph/osd/ceph-2) _read_fsid unparsable uuid 2026-03-31T20:21:22.425 DEBUG:teuthology.orchestra.run.vm03:> sudo chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-31T20:21:22.474 INFO:tasks.ceph:Reading keys from all nodes... 2026-03-31T20:21:22.474 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:22.474 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/mgr/ceph-x/keyring of=/dev/stdout 2026-03-31T20:21:22.521 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:22.521 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/osd/ceph-0/keyring of=/dev/stdout 2026-03-31T20:21:22.569 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:22.569 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/osd/ceph-1/keyring of=/dev/stdout 2026-03-31T20:21:22.617 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:22.617 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/osd/ceph-2/keyring of=/dev/stdout 2026-03-31T20:21:22.669 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:21:22.669 DEBUG:teuthology.orchestra.run.vm03:> dd if=/etc/ceph/ceph.client.0.keyring of=/dev/stdout 2026-03-31T20:21:22.713 INFO:tasks.ceph:Adding keys to all mons... 2026-03-31T20:21:22.713 DEBUG:teuthology.orchestra.run.vm03:> sudo tee -a /etc/ceph/ceph.keyring 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout:[mgr.x] 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout: key = AQC+LMxpopCpLxAARdBofy8A8Mrzze92vD0opg== 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout:[osd.0] 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout: key = AQC/LMxppFQZIxAAvxt8xEb3Mmpb8s5fgCIo/g== 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout:[osd.1] 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout: key = AQDALMxpcV3jIRAARQlDPsae5Lemc/N64DANTA== 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout:[osd.2] 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout: key = AQDBLMxpNOpSHxAAkKNTOz94jfGqrCWsY6plTw== 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout:[client.0] 2026-03-31T20:21:22.761 INFO:teuthology.orchestra.run.vm03.stdout: key = AQC+LMxpiW+EMxAAwuZj5IVhvRvsW21mfP6+CA== 2026-03-31T20:21:22.762 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=mgr.x --cap mon 'allow profile mgr' --cap osd 'allow *' --cap mds 'allow *' 2026-03-31T20:21:22.823 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.0 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *' 2026-03-31T20:21:22.887 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=osd.1 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *' 2026-03-31T20:21:22.953 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring 
--name=osd.2 --cap mon 'allow profile osd' --cap mgr 'allow profile osd' --cap osd 'allow *' 2026-03-31T20:21:23.017 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-authtool /etc/ceph/ceph.keyring --name=client.0 --cap mon 'allow rw' --cap mgr 'allow r' --cap osd 'allow rwx' --cap mds allow 2026-03-31T20:21:23.081 INFO:tasks.ceph:Running mkfs on mon nodes... 2026-03-31T20:21:23.081 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /var/lib/ceph/mon/ceph-a 2026-03-31T20:21:23.130 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-mon --cluster ceph --mkfs -i a --monmap /home/ubuntu/cephtest/ceph.monmap --keyring /etc/ceph/ceph.keyring 2026-03-31T20:21:23.207 DEBUG:teuthology.orchestra.run.vm03:> sudo chown -R ceph:ceph /var/lib/ceph/mon/ceph-a 2026-03-31T20:21:23.259 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /var/lib/ceph/mon/ceph-b 2026-03-31T20:21:23.309 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-mon --cluster ceph --mkfs -i b --monmap /home/ubuntu/cephtest/ceph.monmap --keyring /etc/ceph/ceph.keyring 2026-03-31T20:21:23.386 DEBUG:teuthology.orchestra.run.vm03:> sudo chown -R ceph:ceph /var/lib/ceph/mon/ceph-b 2026-03-31T20:21:23.434 DEBUG:teuthology.orchestra.run.vm03:> sudo mkdir -p /var/lib/ceph/mon/ceph-c 2026-03-31T20:21:23.481 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph-mon --cluster ceph --mkfs -i c --monmap /home/ubuntu/cephtest/ceph.monmap --keyring /etc/ceph/ceph.keyring 2026-03-31T20:21:23.558 DEBUG:teuthology.orchestra.run.vm03:> sudo chown -R ceph:ceph /var/lib/ceph/mon/ceph-c 2026-03-31T20:21:23.606 DEBUG:teuthology.orchestra.run.vm03:> rm -- /home/ubuntu/cephtest/ceph.monmap 2026-03-31T20:21:23.653 INFO:tasks.ceph:Starting mon daemons in cluster ceph... 2026-03-31T20:21:23.653 INFO:tasks.ceph.mon.a:Restarting daemon 2026-03-31T20:21:23.653 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mon -f --cluster ceph -i a 2026-03-31T20:21:23.695 INFO:tasks.ceph.mon.a:Started 2026-03-31T20:21:23.695 INFO:tasks.ceph.mon.b:Restarting daemon 2026-03-31T20:21:23.695 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mon -f --cluster ceph -i b 2026-03-31T20:21:23.696 INFO:tasks.ceph.mon.b:Started 2026-03-31T20:21:23.696 INFO:tasks.ceph.mon.c:Restarting daemon 2026-03-31T20:21:23.696 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mon -f --cluster ceph -i c 2026-03-31T20:21:23.696 INFO:tasks.ceph.mon.c:Started 2026-03-31T20:21:23.697 INFO:tasks.ceph:Starting mgr daemons in cluster ceph... 
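Every daemon here is launched through the same wrapper chain: adjust-ulimits, then ceph-coverage pointed at the archive's coverage directory, then daemon-helper with a kill policy, and finally the foreground (-f) ceph daemon itself. The per-wrapper semantics stated in the comments below are read from the names and usage in this log, not verified against the helper sources; a sketch of assembling the chain:

    def wrapped_daemon_cmd(daemon: str, ident: str, cluster: str = "ceph",
                           coverage_dir: str = "/home/ubuntu/cephtest/archive/coverage") -> list[str]:
        return [
            "sudo",
            "adjust-ulimits",               # presumably raises fd/core limits first
            "ceph-coverage", coverage_dir,  # coverage bookkeeping into the archive
            "daemon-helper", "kill",        # supervise; kill the child on teardown
            daemon, "-f", "--cluster", cluster, "-i", ident,
        ]

    # Reproduces the mon.a launch line from the log:
    # sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage \
    #     daemon-helper kill ceph-mon -f --cluster ceph -i a
    print(" ".join(wrapped_daemon_cmd("ceph-mon", "a")))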
2026-03-31T20:21:23.697 INFO:tasks.ceph.mgr.x:Restarting daemon
2026-03-31T20:21:23.697 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-mgr -f --cluster ceph -i x
2026-03-31T20:21:23.697 INFO:tasks.ceph.mgr.x:Started
2026-03-31T20:21:23.697 DEBUG:tasks.ceph:set 0 configs
2026-03-31T20:21:23.697 DEBUG:teuthology.orchestra.run.vm03:> sudo ceph --cluster ceph config dump
2026-03-31T20:21:24.009 INFO:teuthology.orchestra.run.vm03.stdout:WHO MASK LEVEL OPTION VALUE RO
2026-03-31T20:21:24.019 INFO:tasks.ceph:Setting crush tunables to default
2026-03-31T20:21:24.020 DEBUG:teuthology.orchestra.run.vm03:> sudo ceph --cluster ceph osd crush tunables default
2026-03-31T20:21:24.122 INFO:teuthology.orchestra.run.vm03.stderr:adjusted tunables profile to default
2026-03-31T20:21:24.134 INFO:tasks.ceph:check_enable_crimson: False
2026-03-31T20:21:24.134 INFO:tasks.ceph:Starting osd daemons in cluster ceph...
2026-03-31T20:21:24.134 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:21:24.134 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/osd/ceph-0/fsid of=/dev/stdout
2026-03-31T20:21:24.141 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:21:24.141 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/osd/ceph-1/fsid of=/dev/stdout
2026-03-31T20:21:24.190 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:21:24.190 DEBUG:teuthology.orchestra.run.vm03:> sudo dd if=/var/lib/ceph/osd/ceph-2/fsid of=/dev/stdout
2026-03-31T20:21:24.239 DEBUG:teuthology.orchestra.run.vm03:> sudo ceph --cluster ceph osd new f3bff22c-a21c-470a-aec1-c158b49523b8 0
2026-03-31T20:21:24.392 INFO:teuthology.orchestra.run.vm03.stdout:0
2026-03-31T20:21:24.404 DEBUG:teuthology.orchestra.run.vm03:> sudo ceph --cluster ceph osd new ea32c3bc-b054-4d75-ba06-731b384d7a6a 1
2026-03-31T20:21:24.518 INFO:teuthology.orchestra.run.vm03.stdout:1
2026-03-31T20:21:24.530 DEBUG:teuthology.orchestra.run.vm03:> sudo ceph --cluster ceph osd new b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8 2
2026-03-31T20:21:24.637 INFO:teuthology.orchestra.run.vm03.stdout:2
2026-03-31T20:21:24.649 INFO:tasks.ceph.osd.0:Restarting daemon
2026-03-31T20:21:24.650 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 0
2026-03-31T20:21:24.650 INFO:tasks.ceph.osd.0:Started
2026-03-31T20:21:24.650 INFO:tasks.ceph.osd.1:Restarting daemon
2026-03-31T20:21:24.651 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 1
2026-03-31T20:21:24.651 INFO:tasks.ceph.osd.1:Started
2026-03-31T20:21:24.651 INFO:tasks.ceph.osd.2:Restarting daemon
2026-03-31T20:21:24.651 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-osd -f --cluster ceph -i 2
2026-03-31T20:21:24.652 INFO:tasks.ceph.osd.2:Started
2026-03-31T20:21:24.652 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json
2026-03-31T20:21:24.793 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:24.793 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":5,"fsid":"a4a0ca01-ae82-443e-a7c7-50605716689a","created":"2026-03-31T20:21:23.960433+0000","modified":"2026-03-31T20:21:24.632049+0000","last_up_change":"0.000000","last_in_change":"2026-03-31T20:21:24.632049+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"ea32c3bc-b054-4d75-ba06-731b384d7a6a","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":2,"uuid":"b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}}
2026-03-31T20:21:24.805 INFO:tasks.ceph.ceph_manager.ceph:[]
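2026-03-31T20:21:24.805
Note: before the ceph-osd daemons were launched above, each OSD was registered with the monitors by feeding the fsid written at mkfs time to `ceph osd new <uuid> <id>`; the bare 0/1/2 stdout lines are the assigned ids echoed back. A rough Python sketch of that step (hypothetical helper; the adjust-ulimits/ceph-coverage wrappers from the log are omitted):

    import subprocess

    # Rough sketch of the `ceph osd new` registration step seen in the log.
    def register_osd(osd_id):
        # Read the fsid file for this OSD, as the `sudo dd` calls above do.
        fsid = subprocess.run(
            ["sudo", "dd", "if=/var/lib/ceph/osd/ceph-%d/fsid" % osd_id, "of=/dev/stdout"],
            check=True, capture_output=True, text=True).stdout.strip()
        # `ceph osd new <uuid> <id>` records the OSD in the osdmap; its stdout
        # is the confirmed id (the 0/1/2 lines above).
        subprocess.run(["sudo", "ceph", "--cluster", "ceph", "osd", "new", fsid, str(osd_id)],
                       check=True)

    for i in range(3):
        register_osd(i)

This is why the first osd dump below already shows all three OSDs with state ["exists","new"] even though none is up yet.
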
INFO:tasks.ceph:Waiting for OSDs to come up
2026-03-31T20:21:24.959 INFO:tasks.ceph.osd.1.vm03.stderr:2026-03-31T20:21:24.956+0000 7ff87561da40 -1 Falling back to public interface
2026-03-31T20:21:25.007 INFO:tasks.ceph.osd.0.vm03.stderr:2026-03-31T20:21:25.004+0000 7fe3b721ea40 -1 Falling back to public interface
2026-03-31T20:21:25.023 INFO:tasks.ceph.osd.2.vm03.stderr:2026-03-31T20:21:25.020+0000 7fe002929a40 -1 Falling back to public interface
2026-03-31T20:21:25.107 DEBUG:teuthology.orchestra.run.vm03:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json
2026-03-31T20:21:25.207 INFO:teuthology.misc.health.vm03.stdout:
2026-03-31T20:21:25.208 INFO:teuthology.misc.health.vm03.stdout:{"epoch":5,"fsid":"a4a0ca01-ae82-443e-a7c7-50605716689a","created":"2026-03-31T20:21:23.960433+0000","modified":"2026-03-31T20:21:24.632049+0000","last_up_change":"0.000000","last_in_change":"2026-03-31T20:21:24.632049+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":2,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":0,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[],"osds":[{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":1,"uuid":"ea32c3bc-b054-4d75-ba06-731b384d7a6a","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]},{"osd":2,"uuid":"b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8","up":0,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":0,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[]},"cluster_addrs":{"addrvec":[]},"heartbeat_back_addrs":{"addrvec":[]},"heartbeat_front_addrs":{"addrvec":[]},"public_addr":"(unrecognized address family 0)/0","cluster_addr":"(unrecognized address family 0)/0","heartbeat_back_addr":"(unrecognized address family 0)/0","heartbeat_front_addr":"(unrecognized address family 0)/0","state":["exists","new"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":0,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}}
2026-03-31T20:21:25.221 DEBUG:teuthology.misc:0 of 3 OSDs are up
2026-03-31T20:21:25.463 INFO:tasks.ceph.mgr.x.vm03.stderr:/usr/lib/python3/dist-packages/scipy/__init__.py:67: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
2026-03-31T20:21:25.463 INFO:tasks.ceph.mgr.x.vm03.stderr:Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
2026-03-31T20:21:25.463 INFO:tasks.ceph.mgr.x.vm03.stderr: from numpy import show_config as show_numpy_config
2026-03-31T20:21:25.581 INFO:tasks.ceph.osd.1.vm03.stderr:2026-03-31T20:21:25.576+0000 7ff87561da40 -1 osd.1 0 log_to_monitors true
2026-03-31T20:21:25.631 INFO:tasks.ceph.osd.0.vm03.stderr:2026-03-31T20:21:25.628+0000 7fe3b721ea40 -1 osd.0 0 log_to_monitors true
2026-03-31T20:21:25.669 INFO:tasks.ceph.osd.2.vm03.stderr:2026-03-31T20:21:25.664+0000 7fe002929a40 -1 osd.2 0 log_to_monitors true
2026-03-31T20:21:26.061 INFO:tasks.ceph.mgr.x.vm03.stderr:Failed to import NVMeoFClient and related components: cannot import name 'NVMeoFClient' from 'dashboard.services.nvmeof_client' (/usr/share/ceph/mgr/dashboard/services/nvmeof_client.py)
2026-03-31T20:21:26.987 INFO:tasks.ceph.osd.0.vm03.stderr:2026-03-31T20:21:26.984+0000 7fe3b29a4640 -1 osd.0 0 waiting for initial osdmap
2026-03-31T20:21:26.992 INFO:tasks.ceph.osd.0.vm03.stderr:2026-03-31T20:21:26.988+0000 7fe3ae792640 -1 osd.0 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-31T20:21:26.993 INFO:tasks.ceph.osd.2.vm03.stderr:2026-03-31T20:21:26.988+0000 7fdffe0af640 -1 osd.2 0 waiting for initial osdmap
2026-03-31T20:21:26.994 INFO:tasks.ceph.osd.1.vm03.stderr:2026-03-31T20:21:26.988+0000 7ff871dd8640 -1 osd.1 0 waiting for initial osdmap
2026-03-31T20:21:26.997 INFO:tasks.ceph.osd.1.vm03.stderr:2026-03-31T20:21:26.992+0000 7ff86c390640 -1 osd.1 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-31T20:21:26.998 INFO:tasks.ceph.osd.2.vm03.stderr:2026-03-31T20:21:26.992+0000 7fdff969c640 -1 osd.2 7 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
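
Note: the "0 of 3 OSDs are up" line above comes from a poll loop: the harness repeatedly runs `ceph osd dump --format=json` and counts the entries in the "osds" array whose "up" flag is set, as in the dumps shown here. A simplified Python sketch of such a wait loop (the timeout and poll interval are assumptions, not teuthology's actual values, and the adjust-ulimits/ceph-coverage wrappers are omitted):

    import json
    import subprocess
    import time

    # Simplified sketch of the "N of M OSDs are up" wait seen in the log.
    def wait_for_osds_up(want, timeout=300.0):
        deadline = time.monotonic() + timeout
        while True:
            out = subprocess.run(
                ["ceph", "--cluster", "ceph", "osd", "dump", "--format=json"],
                check=True, capture_output=True, text=True).stdout
            dump = json.loads(out)
            # Each entry carries an integer "up" flag (0 or 1), per the dumps above.
            up = sum(1 for osd in dump["osds"] if osd["up"])
            print("%d of %d OSDs are up" % (up, want))
            if up >= want:
                return
            if time.monotonic() > deadline:
                raise TimeoutError("OSDs did not come up within %.0fs" % timeout)
            time.sleep(6)

In the run below, the next dump (epoch 11) shows all three OSDs with state ["exists","up"], at which point the loop exits.
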
2026-03-31T20:21:27.534 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:21:27.528+0000 7f6f57b35640 -1 mgr.server handle_report got status from non-daemon mon.c
2026-03-31T20:21:27.536 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:21:27.532+0000 7f6f57b35640 -1 mgr.server handle_report got status from non-daemon mon.b
2026-03-31T20:21:31.523 DEBUG:teuthology.orchestra.run.vm03:> adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage ceph --cluster ceph osd dump --format=json
2026-03-31T20:21:31.676 INFO:teuthology.misc.health.vm03.stdout:
2026-03-31T20:21:31.676 INFO:teuthology.misc.health.vm03.stdout:{"epoch":11,"fsid":"a4a0ca01-ae82-443e-a7c7-50605716689a","created":"2026-03-31T20:21:23.960433+0000","modified":"2026-03-31T20:21:31.013580+0000","last_up_change":"2026-03-31T20:21:27.988058+0000","last_in_change":"2026-03-31T20:21:24.632049+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-31T20:21:28.536032+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6805","nonce":950776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6805/950776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]},{"osd":1,"uuid":"ea32c3bc-b054-4d75-ba06-731b384d7a6a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":9,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":3578452477}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6801","nonce":3578452477}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6803","nonce":3578452477}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3578452477}]},"public_addr":"192.168.123.103:6800/3578452477","cluster_addr":"192.168.123.103:6801/3578452477","heartbeat_back_addr":"192.168.123.103:6803/3578452477","heartbeat_front_addr":"192.168.123.103:6802/3578452477","state":["exists","up"]},{"osd":2,"uuid":"b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":1523795840}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6809","nonce":1523795840}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6811","nonce":1523795840}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":1523795840}]},"public_addr":"192.168.123.103:6808/1523795840","cluster_addr":"192.168.123.103:6809/1523795840","heartbeat_back_addr":"192.168.123.103:6811/1523795840","heartbeat_front_addr":"192.168.123.103:6810/1523795840","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}}
2026-03-31T20:21:31.690 DEBUG:teuthology.misc:3 of 3 OSDs are up
2026-03-31T20:21:31.690 INFO:tasks.ceph:Creating RBD pool
2026-03-31T20:21:31.690 DEBUG:teuthology.orchestra.run.vm03:> sudo ceph --cluster ceph osd pool create rbd 8
2026-03-31T20:21:32.035 INFO:teuthology.orchestra.run.vm03.stderr:pool 'rbd' created
2026-03-31T20:21:32.049 DEBUG:teuthology.orchestra.run.vm03:> rbd --cluster ceph pool init rbd
2026-03-31T20:21:35.049 INFO:tasks.ceph:Starting mds daemons in cluster ceph...
2026-03-31T20:21:35.049 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph config log 1 --format=json
2026-03-31T20:21:35.049 INFO:tasks.daemonwatchdog.daemon_watchdog:watchdog starting
2026-03-31T20:21:35.219 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:35.234 INFO:teuthology.orchestra.run.vm03.stdout:[{"version":1,"timestamp":"0.000000","name":"","changes":[]}]
2026-03-31T20:21:35.234 INFO:tasks.ceph_manager:config epoch is 1
2026-03-31T20:21:35.234 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean...
2026-03-31T20:21:35.234 INFO:tasks.ceph.ceph_manager.ceph:waiting for mgr available
2026-03-31T20:21:35.234 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph mgr dump --format=json
2026-03-31T20:21:35.421 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:21:35.435 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":5,"flags":0,"active_gid":4119,"active_name":"x","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6812","nonce":293048097}]},"active_addr":"192.168.123.103:6812/293048097","active_change":"2026-03-31T20:21:26.514949+0000","active_mgr_features":4541880224203014143,"available":true,"standbys":[],"modules":["iostat","nfs"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to, use commas to separate
multiple","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less 
aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","read","upmap","upmap-read"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"update_pg_upmap_activity":{"name":"update_pg_upmap_activity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Updates pg_upmap activity stats to be used in `balancer status detail`","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per 
attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cephadm_log_destination":{"name":"cephadm_log_destination","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":["file","file,syslog","syslog"],"desc":"Destination for cephadm command's persistent logging","long_desc":"","tags":[],"see_also":[]},"certificate_automated_rotation_enabled":{"name":"certificate_automated_rotation_enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"This flag controls whether cephadm automatically rotates certificates upon expiration.","long_desc":"","tags":[],"see_also":[]},"certificate_check_debug_mode":{"name":"certificate_check_debug_mode","type":"bool","level":"dev","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"FOR TESTING ONLY: This flag forces the certificate check instead of waiting for certificate_check_period.","long_desc":"","tags":[],"see_also":[]},"certificate_check_period":{"name":"certificate_check_period","type":"int","level":"advanced","flags":0,"default_value":"1","min":"0","max":"30","enum_allowed":[],"desc":"Specifies how often (in days) the certificate should be checked for validity.","long_desc":"","tags":[],"see_also":[]},"certificate_duration_days":{"name":"certificate_duration_days","type":"int","level":"advanced","flags":0,"default_value":"1095","min":"90","max":"3650","enum_allowed":[],"desc":"Specifies the duration of self certificates generated and signed by cephadm root 
CA","long_desc":"","tags":[],"see_also":[]},"certificate_renewal_threshold_days":{"name":"certificate_renewal_threshold_days","type":"int","level":"advanced","flags":0,"default_value":"30","min":"10","max":"90","enum_allowed":[],"desc":"Specifies the lead time in days to initiate certificate renewal before expiration.","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.28.1","min":"","max":"","enum_allowed":[],"desc":"Alertmanager container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"Elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/grafana:12.3.1","min":"","max":"","enum_allowed":[],"desc":"Grafana container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"Haproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived 
container image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_nginx":{"name":"container_image_nginx","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nginx:sclorg-nginx-126","min":"","max":"","enum_allowed":[],"desc":"Nginx container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.9.1","min":"","max":"","enum_allowed":[],"desc":"Node exporter container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:1.5","min":"","max":"","enum_allowed":[],"desc":"Nvmeof container image","long_desc":"","tags":[],"see_also":[]},"container_image_oauth2_proxy":{"name":"container_image_oauth2_proxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/oauth2-proxy/oauth2-proxy:v7.6.0","min":"","max":"","enum_allowed":[],"desc":"Oauth2 proxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v3.6.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:3.0.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba":{"name":"container_image_samba","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-server:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba container image","long_desc":"","tags":[],"see_also":[]},"container_image_samba_metrics":{"name":"container_image_samba_metrics","type":"str","level":"advanced","flags":0,"default_value":"quay.io/samba.org/samba-metrics:ceph20-centos-amd64","min":"","max":"","enum_allowed":[],"desc":"Samba metrics container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"Snmp gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with `--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"int","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in 
seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"quay.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"grafana_dashboards_path":{"name":"grafana_dashboards_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/grafana/dashboards/ceph-dashboard/","min":"","max":"","enum_allowed":[],"desc":"location of dashboards to include in grafana deployments","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host check","long_desc":"","tags":[],"see_also":[]},"hw_monitoring":{"name":"hw_monitoring","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Deploy hw monitoring daemon on every host.","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. 
Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"oob_default_addr":{"name":"oob_default_addr","type":"str","level":"advanced","flags":0,"default_value":"169.254.1.1","min":"","max":"","enum_allowed":[],"desc":"Default address for RedFish API (oob management).","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. 
Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_count_max":{"name":"ssh_keepalive_count_max","type":"int","level":"advanced","flags":0,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"How many times ssh connections can fail liveness checks before the host is marked offline","long_desc":"","tags":[],"see_also":[]},"ssh_keepalive_interval":{"name":"ssh_keepalive_interval","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"How often ssh connections are checked for liveness","long_desc":"","tags":[],"see_also":[]},"stray_daemon_check_interval":{"name":"stray_daemon_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"how frequently cephadm should check for the presence of stray daemons","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. 
Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0
,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MANAGED_BY_CLUSTERS":{"name":"MANAGED_BY_CLUSTERS","type":"str","level":"advanced","flags":0,"default_value":"[]","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"MULTICLUSTER_CONFIG":{"name":"MULTICLUSTER_CONFIG","type":"str","level":"advanced","flags":0,"default_value":"{}","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROM_ALERT_CREDENTIAL_CACHE_TTL":{"name":"PROM_ALERT_CREDENTIAL_CACHE_TTL","type":"int","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min
":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_HOSTNAME_PER_DAEMON":{"name":"RGW_HOSTNAME_PER_DAEMON","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default
_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crypto_caller":{"name":"crypto_caller","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sso_oauth2":{"name":"sso_oauth2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this 
long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","flags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not 
found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool 
for.","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator backend","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":""
,"long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"default_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP 
requests","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_als
o":[]},"secondary_zone_period_retry_limit":{"name":"secondary_zone_period_retry_limit","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"RGW module period update retry limit for secondary site","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","ta
gs":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_a
llowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"adv
anced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_cloning":{"name":"pause_cloning","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"pause_purging":{"name":"pause_purging","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Pause asynchronous subvolume purge threads","long_desc":"","tags":[],"see_also":[]},"periodic_async_work":{"name":"periodic_async_work","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Periodically check for async work","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay seconds","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_no_wait":{"name":"snapshot_clone_no_wait","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Reject subvolume clone request when cloner threads are 
busy","long_desc":"","tags":[],"see_also":[]},"sqlite3_killpoint":{"name":"sqlite3_killpoint","type":"int","level":"dev","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"squid":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"tentacle":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"force_disabled_modules":{},"last_failure_osd_epoch":0,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":4057350736}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":3884125092}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":1536173869}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.103:0","nonce":3367780566}]}]} 2026-03-31T20:21:35.435 INFO:tasks.ceph.ceph_manager.ceph:mgr available! 2026-03-31T20:21:35.435 INFO:tasks.ceph.ceph_manager.ceph:waiting for all up 2026-03-31T20:21:35.435 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-31T20:21:35.590 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:35.590 
INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":15,"fsid":"a4a0ca01-ae82-443e-a7c7-50605716689a","created":"2026-03-31T20:21:23.960433+0000","modified":"2026-03-31T20:21:35.031918+0000","last_up_change":"2026-03-31T20:21:27.988058+0000","last_in_change":"2026-03-31T20:21:24.632049+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-31T20:21:28.536032+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-31T20:21:31.856309+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6805","nonce":950776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6805/950776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]},{"osd":1,"uuid":"ea32c3bc-b054-4d75-ba06-731b384d7a6a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":3578452477}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6801","nonce":3578452477}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6803","nonce":3578452477}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3578452477}]},"public_addr":"192.168.123.103:6800/3578452477","cluster_addr":"192.168.123.103:6801/3578452477","heartbeat_back_addr":"192.168.123.103:6803/3578452477","heartbeat_front_addr":"192.168.123.103:6802/3578452477","state":["exists","up"]},{"osd":2,"uuid":"b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":1523795840}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6809","nonce":1523795840}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6811","nonce":1523795840}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":1523795840}]},"public_addr":"192.168.123.103:6808/1523795840","cluster_addr":"192.168.123.103:6809/1523795840","heartbeat_back_addr":"192.168.123.103:6811/1523795840","heartbeat_front_addr":"192.168.123.103:6810/1523795840","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":
1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-31T20:21:35.603 INFO:tasks.ceph.ceph_manager.ceph:all up! 2026-03-31T20:21:35.603 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-31T20:21:35.759 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:35.759 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":15,"fsid":"a4a0ca01-ae82-443e-a7c7-50605716689a","created":"2026-03-31T20:21:23.960433+0000","modified":"2026-03-31T20:21:35.031918+0000","last_up_change":"2026-03-31T20:21:27.988058+0000","last_in_change":"2026-03-31T20:21:24.632049+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":4,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":2,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-31T20:21:28.536032+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"11","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-31T20:21:31.856309+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"15","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":15,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6805","nonce":950776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6805/950776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]},{"osd":1,"uuid":"ea32c3bc-b054-4d75-ba06-731b384d7a6a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":3578452477}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6801","nonce":3578452477}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6803","nonce":3578452477}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3578452477}]},"public_addr":"192.168.123.103:6800/3578452477","cluster_addr":"192.168.123.103:6801/3578452477","heartbeat_back_addr":"192.168.123.103:6803/3578452477","heartbeat_front_addr":"192.168.123.103:6802/3578452477","state":["exists","up"]},{"osd":2,"uuid":"b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":12,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":1523795840}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6809","nonce":1523795840}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6811","nonce":1523795840}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":1523795840}]},"public_addr":"192.168.123.103:6808/1523795840","cluster_addr":"192.168.123.103:6809/1523795840","heartbeat_back_addr":"192.168.123.103:6811/1523795840","heartbeat_front_addr":"192.168.123.103:6810/1523795840","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":2,"snaps":[{"begin":2,"length":
1}]}],"new_removed_snaps":[{"pool":2,"snaps":[{"begin":2,"length":1}]}],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-31T20:21:35.772 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 flush_pg_stats 2026-03-31T20:21:35.772 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 flush_pg_stats 2026-03-31T20:21:35.772 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.2 flush_pg_stats 2026-03-31T20:21:35.869 INFO:teuthology.orchestra.run.vm03.stdout:34359738371 2026-03-31T20:21:35.869 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 2026-03-31T20:21:35.870 INFO:teuthology.orchestra.run.vm03.stdout:34359738371 2026-03-31T20:21:35.870 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 2026-03-31T20:21:35.877 INFO:teuthology.orchestra.run.vm03.stdout:34359738371 2026-03-31T20:21:35.877 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.0 2026-03-31T20:21:36.050 INFO:teuthology.orchestra.run.vm03.stdout:34359738370 2026-03-31T20:21:36.064 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.2 2026-03-31T20:21:36.095 INFO:teuthology.orchestra.run.vm03.stdout:34359738370 2026-03-31T20:21:36.098 INFO:teuthology.orchestra.run.vm03.stdout:34359738370 2026-03-31T20:21:36.109 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.0 2026-03-31T20:21:36.110 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738370 for osd.1 2026-03-31T20:21:37.065 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.2 2026-03-31T20:21:37.109 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.0 2026-03-31T20:21:37.111 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd last-stat-seq osd.1 2026-03-31T20:21:37.261 INFO:teuthology.orchestra.run.vm03.stdout:34359738371 2026-03-31T20:21:37.275 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.2 2026-03-31T20:21:37.275 DEBUG:teuthology.parallel:result is None 2026-03-31T20:21:37.285 INFO:teuthology.orchestra.run.vm03.stdout:34359738371 2026-03-31T20:21:37.291 INFO:teuthology.orchestra.run.vm03.stdout:34359738371 2026-03-31T20:21:37.299 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.1 2026-03-31T20:21:37.300 DEBUG:teuthology.parallel:result is None 2026-03-31T20:21:37.305 INFO:tasks.ceph.ceph_manager.ceph:need seq 34359738371 got 34359738371 for osd.0 2026-03-31T20:21:37.305 DEBUG:teuthology.parallel:result is None 
2026-03-31T20:21:37.305 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean 2026-03-31T20:21:37.305 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-31T20:21:37.503 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:37.503 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-31T20:21:37.516 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":16,"stamp":"2026-03-31T20:21:36.524594+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":59,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":35,"ondisk_log_size":35,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":81352,"kb_used_data":856,"kb_used_omap":25,"kb_used_meta":80422,"kb_avail":283034168,"statfs":{"total":289910292480,"available":289826988032,"internally_reserved":0,"allocated":876544,"data_stored":1021615,"data_compressed":9608,"data_compressed_allocated":466944,"data_compressed_original":910368,"omap_allocated":26313,"internal_metadata":82352439},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_store
d":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"2.491587"},"pg_stats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.038118+0000","last_change":"2026-03-31T20:21:35.038270+0000","last_active":"2026-03-31T20:21:35.038118+0000","last_peered":"2026-03-31T20:21:35.038118+0000","last_clean":"2026-03-31T20:21:35.038118+0000","last_became_active":"2026-03-31T20:21:33.038618+0000","last_became_peered":"2026-03-31T20:21:33.038618+0000","last_unstale":"2026-03-31T20:21:35.038118+0000","last_undegraded":"2026-03-31T20:21:35.038118+0000","last_fullsized":"2026-03-31T20:21:35.038118+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-01T21:37:49.889387+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037699+0000","last_change":"2026-03-31T20:21:35.037923+0000","last_active":"2026-03-31T20:21:35.037699+0000","last_peered":"2026-03-31T20:21:35.037699+0000","last_clean":"2026-03-31T20:21:35.037699+0000","last_became_active":"2026-03-31T20:21:33.038527+0000","last_became_peered":"2026-03-31T20:21:33.038527+0000","last_unstale":"2026-03-31T20:21:35.037699+0000","last_undegraded":"2026-03-31T20:21:35.037699+0000","last_fullsized":"2026-03-31T20:21:35.037699+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-0
3-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-01T23:12:47.404171+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037945+0000","last_change":"2026-03-31T20:21:35.038056+0000","last_active":"2026-03-31T20:21:35.037945+0000","last_peered":"2026-03-31T20:21:35.037945+0000","last_clean":"2026-03-31T20:21:35.037945+0000","last_became_active":"2026-03-31T20:21:33.039071+0000","last_became_peered":"2026-03-31T20:21:33.039071+0000","last_unstale":"2026-03-31T20:21:35.037945+0000","last_undegraded":"2026-03-31T20:21:35.037945+0000","last_fullsized":"2026-03-31T20:21:35.037945+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T07:12:07.333081+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037778+0000","last_change":"2026-03-31T20:21:35.038061+0000","last_active":"2026-03-31T20:21:35.037778+0000","last_peered":"2026-03-31T20:21:35.037778+0000","last_clean":"2026-03-31T20:21:35.037778+0000","last_became_active":"2026-03-31T20:21:33.038780+0000","last_became_peered":"2026-03-31T20:21:33.038780+0000","last_unstale":"2026-03-31T20:21:35.037778+0000","last_undegraded":"2026-03-31T20:21:35.037778+0000","last_fullsized":"2026-03-31T20:21:35.037778+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:01:36.825157+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037546+0000","last_change":"2026-03-31T20:21:35.037546+0000","last_active":"2026-03-31T20:21:35.037546+0000","last_peered":"2026-03-31T20:21:35.037546+0000","last_clean":"2026-03-31T20:21:35.037546+0000","last_became_active":"2026-03-31T20:21:33.036706+0000","last_became_peered":"2026-03-31T20:21:33.036706+0000","last_unstale":"2026-03-31T20:21:35.037546+0000","last_undegraded":"2026-03-31T20:21:35.037546+0000","last_fullsized":"2026-03-31T20:21:35.037546+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T04:18:19.172037+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.041289+0000","last_change":"2026-03-31T20:21:35.041386+0000","last_active":"2026-03-31T20:21:35.041289+0000","last_peered":"2026-03-31T20:21:35.041289+0000","last_clean":"2026-03-31T20:21:35.041289+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:21:35.041289+0000","last_undegraded":"2026-03-31T20:21:35.041289+0000","last_fullsized":"2026-03-31T20:21:35.041289+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T22:00:11.952577+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.041303+0000","last_change":"2026-03-31T20:21:35.041462+0000","last_active":"2026-03-31T20:21:35.041303+0000","last_peered":"2026-03-31T20:21:35.041303+0000","last_clean":"2026-03-31T20:21:35.041303+0000","last_became_active":"2026-03-31T20:21:33.036655+0000","last_became_peered":"2026-03-31T20:21:33.036655+0000","last_unstale":"2026-03-31T20:21:35.041303+0000","last_undegraded":"2026-03-31T20:21:35.041303+0000","last_fullsized":"2026-03-31T20:21:35.041303+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:46:03.277652+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.038101+0000","last_change":"2026-03-31T20:21:35.038258+0000","last_active":"2026-03-31T20:21:35.038101+0000","last_peered":"2026-03-31T20:21:35.038101+0000","last_clean":"2026-03-31T20:21:35.038101+0000","last_became_active":"2026-03-31T20:21:33.038753+0000","last_became_peered":"2026-03-31T20:21:33.038753+0000","last_unstale":"2026-03-31T20:21:35.038101+0000","last_undegraded":"2026-03-31T20:21:35.038101+0000","last_fullsized":"2026-03-31T20:21:35.038101+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T23:34:37.955562+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"10'32","reported_seq":65,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037654+0000","last_change":"2026-03-31T20:21:30.016540+0000","last_active":"2026-03-31T20:21:35.037654+0000","last_peered":"2026-03-31T20:21:35.037654+0000","last_clean":"2026-03-31T20:21:35.037654+0000","last_became_active":"2026-03-31T20:21:30.016404+0000","last_became_peered":"2026-03-31T20:21:30.016404+0000","last_unstale":"2026-03-31T20:21:35.037654+0000","last_undegraded":"2026-03-31T20:21:35.037654+0000","last_fullsized":"2026-03-31T20:21:35.037654+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:28.998269+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:28.998269+0000","last_clean_scrub_stamp":"2026-03-31T20:21:28.998269+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T05:07:29.140176+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":475136,"data_stored":918560,"data_compressed":9608,"data_compressed_allocated":466944,"data_compressed_original":910368,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":2,"a
cting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26960,"kb_used_data":128,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344880,"statfs":{"total":96636764160,"available":96609157120,"internally_reserved":0,"allocated":131072,"data_stored":34339,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738371,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27196,"kb_used_data":364,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344644,"statfs":{"total":96636764160,"available":96608915456,"internally_reserved":0,"allocated":372736,"data_stored":493638,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738371,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27196,"kb_used_data":364,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344644,"statfs":{"total":96636764160,"available":96608915456,"internally_reserved":0,"allocated":372736,"data_stored":493638,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-31T20:21:37.517 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph 
--cluster ceph pg dump --format=json 2026-03-31T20:21:37.681 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:37.681 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-31T20:21:37.693 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":16,"stamp":"2026-03-31T20:21:36.524594+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":59,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":35,"ondisk_log_size":35,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":81352,"kb_used_data":856,"kb_used_omap":25,"kb_used_meta":80422,"kb_avail":283034168,"statfs":{"total":289910292480,"available":289826988032,"internally_reserved":0,"allocated":876544,"data_stored":1021615,"data_compressed":9608,"data_compressed_allocated":466944,"data_compressed_original":910368,"omap_allocated":26313,"internal_metadata":82352439},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"2.491587"},"pg_s
tats":[{"pgid":"2.7","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.038118+0000","last_change":"2026-03-31T20:21:35.038270+0000","last_active":"2026-03-31T20:21:35.038118+0000","last_peered":"2026-03-31T20:21:35.038118+0000","last_clean":"2026-03-31T20:21:35.038118+0000","last_became_active":"2026-03-31T20:21:33.038618+0000","last_became_peered":"2026-03-31T20:21:33.038618+0000","last_unstale":"2026-03-31T20:21:35.038118+0000","last_undegraded":"2026-03-31T20:21:35.038118+0000","last_fullsized":"2026-03-31T20:21:35.038118+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-01T21:37:49.889387+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.6","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037699+0000","last_change":"2026-03-31T20:21:35.037923+0000","last_active":"2026-03-31T20:21:35.037699+0000","last_peered":"2026-03-31T20:21:35.037699+0000","last_clean":"2026-03-31T20:21:35.037699+0000","last_became_active":"2026-03-31T20:21:33.038527+0000","last_became_peered":"2026-03-31T20:21:33.038527+0000","last_unstale":"2026-03-31T20:21:35.037699+0000","last_undegraded":"2026-03-31T20:21:35.037699+0000","last_fullsized":"2026-03-31T20:21:35.037699+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid"
:false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-01T23:12:47.404171+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.5","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037945+0000","last_change":"2026-03-31T20:21:35.038056+0000","last_active":"2026-03-31T20:21:35.037945+0000","last_peered":"2026-03-31T20:21:35.037945+0000","last_clean":"2026-03-31T20:21:35.037945+0000","last_became_active":"2026-03-31T20:21:33.039071+0000","last_became_peered":"2026-03-31T20:21:33.039071+0000","last_unstale":"2026-03-31T20:21:35.037945+0000","last_undegraded":"2026-03-31T20:21:35.037945+0000","last_fullsized":"2026-03-31T20:21:35.037945+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T07:12:07.333081+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.4","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037778+0000","last_change":"2026-03-31T20:21:35.038061+0000","last_active":"2026-03-31T20:21:35.037778+0000","last_peered":"2026-03-31T20:21:35.037778+0000","last_clean":"2026-03-31T20:21:35.037778+0000","last_became_active":"2026-03-31T20:21:33.038780+0000","last_became_peered":"2026-03-31T20:21:33.038780+0000","last_unstale":"2026-03-31T20:21:35.037778+0000","last_undegraded":"2026-03-31T20:21:35.037778+0000","last_fullsized":"2026-03-31T20:21:35.037778+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:01:36.825157+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.2","version":"15'2","reported_seq":22,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037546+0000","last_change":"2026-03-31T20:21:35.037546+0000","last_active":"2026-03-31T20:21:35.037546+0000","last_peered":"2026-03-31T20:21:35.037546+0000","last_clean":"2026-03-31T20:21:35.037546+0000","last_became_active":"2026-03-31T20:21:33.036706+0000","last_became_peered":"2026-03-31T20:21:33.036706+0000","last_unstale":"2026-03-31T20:21:35.037546+0000","last_undegraded":"2026-03-31T20:21:35.037546+0000","last_fullsized":"2026-03-31T20:21:35.037546+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T04:18:19.172037+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.1","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.041289+0000","last_change":"2026-03-31T20:21:35.041386+0000","last_active":"2026-03-31T20:21:35.041289+0000","last_peered":"2026-03-31T20:21:35.041289+0000","last_clean":"2026-03-31T20:21:35.041289+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:21:35.041289+0000","last_undegraded":"2026-03-31T20:21:35.041289+0000","last_fullsized":"2026-03-31T20:21:35.041289+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T22:00:11.952577+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.0","version":"0'0","reported_seq":20,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.041303+0000","last_change":"2026-03-31T20:21:35.041462+0000","last_active":"2026-03-31T20:21:35.041303+0000","last_peered":"2026-03-31T20:21:35.041303+0000","last_clean":"2026-03-31T20:21:35.041303+0000","last_became_active":"2026-03-31T20:21:33.036655+0000","last_became_peered":"2026-03-31T20:21:33.036655+0000","last_unstale":"2026-03-31T20:21:35.041303+0000","last_undegraded":"2026-03-31T20:21:35.041303+0000","last_fullsized":"2026-03-31T20:21:35.041303+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:46:03.277652+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"2.3","version":"13'1","reported_seq":21,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.038101+0000","last_change":"2026-03-31T20:21:35.038258+0000","last_active":"2026-03-31T20:21:35.038101+0000","last_peered":"2026-03-31T20:21:35.038101+0000","last_clean":"2026-03-31T20:21:35.038101+0000","last_became_active":"2026-03-31T20:21:33.038753+0000","last_became_peered":"2026-03-31T20:21:33.038753+0000","last_unstale":"2026-03-31T20:21:35.038101+0000","last_undegraded":"2026-03-31T20:21:35.038101+0000","last_fullsized":"2026-03-31T20:21:35.038101+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:32.020592+0000","last_clean_scrub_stamp":"2026-03-31T20:21:32.020592+0000","objects_scrubbed":0,"log_size":1,"log_dups_size":0,"ondisk_log_size":1,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T23:34:37.955562+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[{"start":"2","length":"1"}]},{"pgid":"1.0","version":"10'32","reported_seq":65,"reported_epoch":15,"state":"active+clean","last_fresh":"2026-03-31T20:21:35.037654+0000","last_change":"2026-03-31T20:21:30.016540+0000","last_active":"2026-03-31T20:21:35.037654+0000","last_peered":"2026-03-31T20:21:35.037654+0000","last_clean":"2026-03-31T20:21:35.037654+0000","last_became_active":"2026-03-31T20:21:30.016404+0000","last_became_peered":"2026-03-31T20:21:30.016404+0000","last_unstale":"2026-03-31T20:21:35.037654+0000","last_undegraded":"2026-03-31T20:21:35.037654+0000","last_fullsized":"2026-03-31T20:21:35.037654+0000","mapping_epoch":9,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":10,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:21:28.998269+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:21:28.998269+0000","last_clean_scrub_stamp":"2026-03-31T20:21:28.998269+0000","objects_scrubbed":0,"log_size":32,"log_dups_size":0,"ondisk_log_size":32,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T05:07:29.140176+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":3,"ondisk_log_size":3,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":46,"num_read_kb":37,"num_write":57,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":475136,"data_stored":918560,"data_compressed":9608,"data_compressed_allocated":466944,"data_compressed_original":910368,"omap_allocated":0,"internal_metadata":0},"log_size":32,"ondisk_log_size":32,"up":2,"a
cting":2,"num_store_stats":2}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738371,"num_pgs":3,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":26960,"kb_used_data":128,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344880,"statfs":{"total":96636764160,"available":96609157120,"internally_reserved":0,"allocated":131072,"data_stored":34339,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738371,"num_pgs":9,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27196,"kb_used_data":364,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344644,"statfs":{"total":96636764160,"available":96608915456,"internally_reserved":0,"allocated":372736,"data_stored":493638,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":8,"seq":34359738371,"num_pgs":6,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":27196,"kb_used_data":364,"kb_used_omap":8,"kb_used_meta":26807,"kb_avail":94344644,"statfs":{"total":96636764160,"available":96608915456,"internally_reserved":0,"allocated":372736,"data_stored":493638,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":8771,"internal_metadata":27450813},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 2026-03-31T20:21:37.694 INFO:tasks.ceph.ceph_manager.ceph:clean! 2026-03-31T20:21:37.694 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 
2026-03-31T20:21:37.694 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy 2026-03-31T20:21:37.694 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph health --format=json 2026-03-31T20:21:37.878 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:21:37.878 INFO:teuthology.orchestra.run.vm03.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-31T20:21:37.891 INFO:tasks.ceph.ceph_manager.ceph:wait_until_healthy done 2026-03-31T20:21:37.891 INFO:teuthology.run_tasks:Running task workunit... 2026-03-31T20:21:37.895 INFO:tasks.workunit:Pulling workunits from ref 0392f78529848ec72469e8e431875cb98d3a5fb4 2026-03-31T20:21:37.895 INFO:tasks.workunit:Making a separate scratch dir for every client... 2026-03-31T20:21:37.895 INFO:tasks.workunit:timeout=3h 2026-03-31T20:21:37.895 INFO:tasks.workunit:cleanup=True 2026-03-31T20:21:37.895 DEBUG:teuthology.orchestra.run.vm03:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-31T20:21:37.898 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-31T20:21:37.898 INFO:teuthology.orchestra.run.vm03.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory 2026-03-31T20:21:37.898 DEBUG:teuthology.orchestra.run.vm03:> mkdir -- /home/ubuntu/cephtest/mnt.0 2026-03-31T20:21:37.945 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0 2026-03-31T20:21:37.945 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0 2026-03-31T20:21:37.989 DEBUG:teuthology.orchestra.run.vm03:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 0392f78529848ec72469e8e431875cb98d3a5fb4 2026-03-31T20:21:38.034 INFO:tasks.workunit.client.0.vm03.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr:Note: switching to '0392f78529848ec72469e8e431875cb98d3a5fb4'. 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr:state without impacting any branches by switching back to a branch. 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr: git switch -c <new-branch-name> 2026-03-31T20:22:15.574 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:22:15.575 INFO:tasks.workunit.client.0.vm03.stderr:Or undo this operation with: 2026-03-31T20:22:15.575 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:22:15.575 INFO:tasks.workunit.client.0.vm03.stderr: git switch - 2026-03-31T20:22:15.575 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:22:15.575 INFO:tasks.workunit.client.0.vm03.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-31T20:22:15.575 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:22:15.575 INFO:tasks.workunit.client.0.vm03.stderr:HEAD is now at 0392f785298 qa/tasks/keystone: restart mariadb for rocky and alma linux too 2026-03-31T20:22:15.581 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-31T20:22:15.625 INFO:tasks.workunit.client.0.vm03.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-31T20:22:15.627 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-31T20:22:15.627 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-31T20:22:15.667 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-31T20:22:15.696 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-31T20:22:15.720 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-31T20:22:15.720 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-31T20:22:15.720 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-31T20:22:15.745 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-31T20:22:15.747 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-31T20:22:15.747 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-31T20:22:15.794 INFO:tasks.workunit:Running workunits matching cephtool on client.0... 2026-03-31T20:22:15.794 INFO:tasks.workunit:Running workunit cephtool/test.sh... 
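
Condensed from the commands above: the workunit task clones the suite repo at the pinned sha1, builds the helper binaries via the workunits Makefile, and records the list of executable tests. The same preparation can be reproduced by hand; the /tmp paths below are illustrative stand-ins for /home/ubuntu/cephtest:

    git clone https://github.com/kshtsk/ceph.git /tmp/clone.client.0
    cd /tmp/clone.client.0
    git checkout 0392f78529848ec72469e8e431875cb98d3a5fb4   # detached HEAD is expected here
    cd qa/workunits
    if test -e Makefile ; then make ; fi                    # compiles the direct_io and fs helpers
    find -executable -type f -printf '%P\0' > /tmp/workunits.list.client.0
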
2026-03-31T20:22:15.794 DEBUG:teuthology.orchestra.run.vm03:workunit test cephtool/test.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh 2026-03-31T20:22:15.841 INFO:tasks.workunit.client.0.vm03.stderr:++ dirname /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh 2026-03-31T20:22:15.842 INFO:tasks.workunit.client.0.vm03.stderr:+ source /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh 2026-03-31T20:22:15.842 INFO:tasks.workunit.client.0.vm03.stderr:++ TIMEOUT=300 2026-03-31T20:22:15.842 INFO:tasks.workunit.client.0.vm03.stderr:++ WAIT_FOR_CLEAN_TIMEOUT=90 2026-03-31T20:22:15.842 INFO:tasks.workunit.client.0.vm03.stderr:++ MAX_TIMEOUT=15 2026-03-31T20:22:15.842 INFO:tasks.workunit.client.0.vm03.stderr:++ PG_NUM=4 2026-03-31T20:22:15.842 INFO:tasks.workunit.client.0.vm03.stderr:++ TMPDIR=/tmp 2026-03-31T20:22:15.842 INFO:tasks.workunit.client.0.vm03.stderr:++ CEPH_BUILD_VIRTUALENV=/tmp 2026-03-31T20:22:15.842 INFO:tasks.workunit.client.0.vm03.stderr:++ TESTDIR=/home/ubuntu/cephtest 2026-03-31T20:22:15.843 INFO:tasks.workunit.client.0.vm03.stderr:+++ uname 2026-03-31T20:22:15.843 INFO:tasks.workunit.client.0.vm03.stderr:++ '[' Linux = FreeBSD ']' 2026-03-31T20:22:15.843 INFO:tasks.workunit.client.0.vm03.stderr:++ SED=sed 2026-03-31T20:22:15.843 INFO:tasks.workunit.client.0.vm03.stderr:++ AWK=awk 2026-03-31T20:22:15.844 INFO:tasks.workunit.client.0.vm03.stderr:+++ stty -a 2026-03-31T20:22:15.844 INFO:tasks.workunit.client.0.vm03.stderr:+++ head -1 2026-03-31T20:22:15.844 INFO:tasks.workunit.client.0.vm03.stderr:+++ sed -e 's/.*columns \([0-9]*\).*/\1/' 2026-03-31T20:22:15.845 INFO:tasks.workunit.client.0.vm03.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-31T20:22:15.845 INFO:tasks.workunit.client.0.vm03.stderr:++ termwidth= 2026-03-31T20:22:15.845 INFO:tasks.workunit.client.0.vm03.stderr:++ '[' -n '' -a '' '!=' 0 ']' 2026-03-31T20:22:15.845 INFO:tasks.workunit.client.0.vm03.stderr:++ DIFFCOLOPTS='-y ' 2026-03-31T20:22:15.845 INFO:tasks.workunit.client.0.vm03.stderr:++ KERNCORE=kernel.core_pattern 2026-03-31T20:22:15.845 INFO:tasks.workunit.client.0.vm03.stderr:++ EXTRA_OPTS= 2026-03-31T20:22:15.847 INFO:tasks.workunit.client.0.vm03.stderr:++ test '' = TESTS 2026-03-31T20:22:15.847 INFO:tasks.workunit.client.0.vm03.stderr:+ set -e 2026-03-31T20:22:15.847 INFO:tasks.workunit.client.0.vm03.stderr:+ set -o functrace 2026-03-31T20:22:15.847 INFO:tasks.workunit.client.0.vm03.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-31T20:22:15.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:11: : SUDO=sudo 2026-03-31T20:22:15.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:12: : export CEPH_DEV=1 2026-03-31T20:22:15.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:12: : CEPH_DEV=1 2026-03-31T20:22:15.847 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:45: : mktemp -d /tmp/cephtool.XXX 2026-03-31T20:22:15.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:45: : TEMP_DIR=/tmp/cephtool.sYl 2026-03-31T20:22:15.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:46: : trap 'rm -fr /tmp/cephtool.sYl' 0 2026-03-31T20:22:15.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:48: : mktemp /tmp/cephtool.sYl/test_invalid.XXX 2026-03-31T20:22:15.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:48: : TMPFILE=/tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:22:15.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2964: : set +x 2026-03-31T20:22:16.052 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' already exists 2026-03-31T20:22:16.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_injectargs 2026-03-31T20:22:16.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:201: test_mon_injectargs: ceph tell osd.0 injectargs --no-osd_enable_op_tracker 2026-03-31T20:22:16.776 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:16.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:202: test_mon_injectargs: grep false 2026-03-31T20:22:16.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:202: test_mon_injectargs: ceph tell osd.0 config get osd_enable_op_tracker 2026-03-31T20:22:16.868 INFO:tasks.workunit.client.0.vm03.stdout: "osd_enable_op_tracker": "false" 2026-03-31T20:22:16.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:203: test_mon_injectargs: ceph tell osd.0 injectargs '--osd_enable_op_tracker --osd_op_history_duration 500' 2026-03-31T20:22:16.943 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:16.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:204: test_mon_injectargs: ceph tell osd.0 config get osd_enable_op_tracker 2026-03-31T20:22:16.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:204: test_mon_injectargs: grep true 2026-03-31T20:22:17.031 INFO:tasks.workunit.client.0.vm03.stdout: "osd_enable_op_tracker": "true" 2026-03-31T20:22:17.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:205: test_mon_injectargs: ceph tell osd.0 config get osd_op_history_duration 2026-03-31T20:22:17.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:205: test_mon_injectargs: grep 500 2026-03-31T20:22:17.111 INFO:tasks.workunit.client.0.vm03.stdout: "osd_op_history_duration": "500" 2026-03-31T20:22:17.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:206: test_mon_injectargs: ceph tell osd.0 injectargs --no-osd_enable_op_tracker 2026-03-31T20:22:17.183 INFO:tasks.workunit.client.0.vm03.stdout:{} 
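
The test_mon_injectargs trace above drives one option through several injectargs spellings (a --no- prefix, a quoted multi-option string, a bare "--" separator) and verifies each change via config get. Condensed to the toggle-and-verify pattern, with osd.0 and the option names taken from the run above:

    ceph tell osd.0 injectargs --no-osd_enable_op_tracker
    ceph tell osd.0 config get osd_enable_op_tracker | grep false
    ceph tell osd.0 injectargs -- '--osd_enable_op_tracker --osd_op_history_duration 600'
    ceph tell osd.0 config get osd_op_history_duration | grep 600
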
2026-03-31T20:22:17.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:207: test_mon_injectargs: ceph tell osd.0 config get osd_enable_op_tracker 2026-03-31T20:22:17.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:207: test_mon_injectargs: grep false 2026-03-31T20:22:17.271 INFO:tasks.workunit.client.0.vm03.stdout: "osd_enable_op_tracker": "false" 2026-03-31T20:22:17.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:208: test_mon_injectargs: ceph tell osd.0 injectargs -- --osd_enable_op_tracker 2026-03-31T20:22:17.344 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:17.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:209: test_mon_injectargs: ceph tell osd.0 config get osd_enable_op_tracker 2026-03-31T20:22:17.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:209: test_mon_injectargs: grep true 2026-03-31T20:22:17.441 INFO:tasks.workunit.client.0.vm03.stdout: "osd_enable_op_tracker": "true" 2026-03-31T20:22:17.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:210: test_mon_injectargs: ceph tell osd.0 injectargs -- '--osd_enable_op_tracker --osd_op_history_duration 600' 2026-03-31T20:22:17.517 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:17.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:211: test_mon_injectargs: ceph tell osd.0 config get osd_enable_op_tracker 2026-03-31T20:22:17.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:211: test_mon_injectargs: grep true 2026-03-31T20:22:17.612 INFO:tasks.workunit.client.0.vm03.stdout: "osd_enable_op_tracker": "true" 2026-03-31T20:22:17.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:212: test_mon_injectargs: ceph tell osd.0 config get osd_op_history_duration 2026-03-31T20:22:17.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:212: test_mon_injectargs: grep 600 2026-03-31T20:22:17.698 INFO:tasks.workunit.client.0.vm03.stdout: "osd_op_history_duration": "600" 2026-03-31T20:22:17.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:214: test_mon_injectargs: ceph tell osd.0 injectargs -- '--osd_deep_scrub_interval 2419200' 2026-03-31T20:22:17.776 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:17.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:215: test_mon_injectargs: ceph tell osd.0 config get osd_deep_scrub_interval 2026-03-31T20:22:17.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:215: test_mon_injectargs: grep 2419200 2026-03-31T20:22:17.866 INFO:tasks.workunit.client.0.vm03.stdout: "osd_deep_scrub_interval": "2419200.000000" 2026-03-31T20:22:17.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:217: test_mon_injectargs: ceph tell osd.0 injectargs -- '--mon_probe_timeout 2' 2026-03-31T20:22:17.939 INFO:tasks.workunit.client.0.vm03.stdout:{} 
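
A few records further on, the run enters test_mon_injectargs_SI (traced below), which asserts that integer options accept SI suffixes while unknown suffixes are rejected. A condensed sketch of those assertions, with mon.a and mon_pg_warn_min_objects as in the trace; the third set is expected to fail:

    sudo ceph daemon mon.a config set mon_pg_warn_min_objects 10K   # read back as 10000
    sudo ceph daemon mon.a config set mon_pg_warn_min_objects 1G    # read back as 1000000000
    sudo ceph daemon mon.a config set mon_pg_warn_min_objects 10F   # rejected: (22) Invalid argument
    sudo ceph daemon mon.a config get mon_pg_warn_min_objects
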
2026-03-31T20:22:17.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:218: test_mon_injectargs: ceph tell osd.0 config get mon_probe_timeout 2026-03-31T20:22:17.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:218: test_mon_injectargs: grep 2 2026-03-31T20:22:18.028 INFO:tasks.workunit.client.0.vm03.stdout: "mon_probe_timeout": "2.000000" 2026-03-31T20:22:18.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:220: test_mon_injectargs: ceph tell osd.0 injectargs -- '--mon-lease 6' 2026-03-31T20:22:18.101 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:18.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:221: test_mon_injectargs: ceph tell osd.0 config get mon_lease 2026-03-31T20:22:18.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:221: test_mon_injectargs: grep 6 2026-03-31T20:22:18.190 INFO:tasks.workunit.client.0.vm03.stdout: "mon_lease": "6.000000" 2026-03-31T20:22:18.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:224: test_mon_injectargs: expect_false ceph tell osd.0 injectargs --osd-scrub-auto-repair-num-errors -1 2026-03-31T20:22:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:225: test_mon_injectargs: check_response 'Error EINVAL: Parse error setting osd_scrub_auto_repair_num_errors to '\''-1'\'' using injectargs' 2026-03-31T20:22:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='Error EINVAL: Parse error setting osd_scrub_auto_repair_num_errors to '\''-1'\'' using injectargs' 2026-03-31T20:22:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:22:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:22:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:22:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'Error EINVAL: Parse error setting osd_scrub_auto_repair_num_errors to '\''-1'\'' using injectargs' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:22:18.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:227: test_mon_injectargs: expect_failure /tmp/cephtool.sYl 'Option --osd_op_history_duration requires an argument' ceph tell osd.0 injectargs -- --osd_op_history_duration 2026-03-31T20:22:18.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2021: expect_failure: local dir=/tmp/cephtool.sYl 2026-03-31T20:22:18.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2022: expect_failure: shift 2026-03-31T20:22:18.260 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2023: expect_failure: local 'expected=Option --osd_op_history_duration requires an argument' 2026-03-31T20:22:18.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2024: expect_failure: shift 2026-03-31T20:22:18.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2025: expect_failure: local success 2026-03-31T20:22:18.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2027: expect_failure: ceph tell osd.0 injectargs -- --osd_op_history_duration 2026-03-31T20:22:18.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2030: expect_failure: success=false 2026-03-31T20:22:18.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2033: expect_failure: false 2026-03-31T20:22:18.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2033: expect_failure: grep --quiet 'Option --osd_op_history_duration requires an argument' /tmp/cephtool.sYl/out 2026-03-31T20:22:18.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2037: expect_failure: return 0 2026-03-31T20:22:18.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:22:18.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_injectargs_SI 2026-03-31T20:22:18.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:241: test_mon_injectargs_SI: get_config_value_or_die mon.a mon_pg_warn_min_objects 2026-03-31T20:22:18.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:120: get_config_value_or_die: local target config_opt raw val 2026-03-31T20:22:18.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:122: get_config_value_or_die: target=mon.a 2026-03-31T20:22:18.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:123: get_config_value_or_die: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:18.543 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: sudo ceph daemon mon.a config get mon_pg_warn_min_objects 2026-03-31T20:22:18.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: raw='{ 2026-03-31T20:22:18.624 INFO:tasks.workunit.client.0.vm03.stderr: "mon_pg_warn_min_objects": "10000" 2026-03-31T20:22:18.624 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:22:18.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:126: get_config_value_or_die: [[ 0 -ne 0 ]] 2026-03-31T20:22:18.624 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: echo '{' '"mon_pg_warn_min_objects":' '"10000"' '}' 2026-03-31T20:22:18.624 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: sed -e 's/[{} "]//g' 2026-03-31T20:22:18.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: raw=mon_pg_warn_min_objects:10000 2026-03-31T20:22:18.625 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: echo mon_pg_warn_min_objects:10000 2026-03-31T20:22:18.625 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: cut -f2 -d: 2026-03-31T20:22:18.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: val=10000 2026-03-31T20:22:18.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:134: get_config_value_or_die: echo 10000 2026-03-31T20:22:18.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:135: get_config_value_or_die: return 0 2026-03-31T20:22:18.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:241: test_mon_injectargs_SI: initial_value=10000 2026-03-31T20:22:18.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:242: test_mon_injectargs_SI: sudo ceph daemon mon.a config set mon_pg_warn_min_objects 10 2026-03-31T20:22:18.692 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:22:18.692 INFO:tasks.workunit.client.0.vm03.stdout: "success": "mon_pg_warn_min_objects = '' (not observed, change may require restart) " 2026-03-31T20:22:18.692 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:22:18.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:243: test_mon_injectargs_SI: expect_config_value mon.a mon_pg_warn_min_objects 10 2026-03-31T20:22:18.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:140: expect_config_value: local target config_opt expected_val val 2026-03-31T20:22:18.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:141: expect_config_value: target=mon.a 2026-03-31T20:22:18.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:142: expect_config_value: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:18.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:143: expect_config_value: expected_val=10 2026-03-31T20:22:18.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: get_config_value_or_die mon.a mon_pg_warn_min_objects 2026-03-31T20:22:18.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:120: get_config_value_or_die: local target config_opt raw val 2026-03-31T20:22:18.701 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:122: get_config_value_or_die: target=mon.a 2026-03-31T20:22:18.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:123: get_config_value_or_die: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:18.701 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: sudo ceph daemon mon.a config get mon_pg_warn_min_objects 2026-03-31T20:22:18.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: raw='{ 2026-03-31T20:22:18.776 INFO:tasks.workunit.client.0.vm03.stderr: "mon_pg_warn_min_objects": "10" 2026-03-31T20:22:18.776 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:22:18.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:126: get_config_value_or_die: [[ 0 -ne 0 ]] 2026-03-31T20:22:18.776 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: echo '{' '"mon_pg_warn_min_objects":' '"10"' '}' 2026-03-31T20:22:18.776 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: sed -e 's/[{} "]//g' 2026-03-31T20:22:18.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: raw=mon_pg_warn_min_objects:10 2026-03-31T20:22:18.777 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: echo mon_pg_warn_min_objects:10 2026-03-31T20:22:18.777 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: cut -f2 -d: 2026-03-31T20:22:18.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: val=10 2026-03-31T20:22:18.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:134: get_config_value_or_die: echo 10 2026-03-31T20:22:18.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:135: get_config_value_or_die: return 0 2026-03-31T20:22:18.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: val=10 2026-03-31T20:22:18.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:147: expect_config_value: [[ 10 != \1\0 ]] 2026-03-31T20:22:18.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:244: test_mon_injectargs_SI: sudo ceph daemon mon.a config set mon_pg_warn_min_objects 10K 2026-03-31T20:22:18.847 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:22:18.847 INFO:tasks.workunit.client.0.vm03.stdout: "success": "mon_pg_warn_min_objects = '' (not observed, change may require restart) " 2026-03-31T20:22:18.847 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:245: 
test_mon_injectargs_SI: expect_config_value mon.a mon_pg_warn_min_objects 10000 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:140: expect_config_value: local target config_opt expected_val val 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:141: expect_config_value: target=mon.a 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:142: expect_config_value: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:143: expect_config_value: expected_val=10000 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: get_config_value_or_die mon.a mon_pg_warn_min_objects 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:120: get_config_value_or_die: local target config_opt raw val 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:122: get_config_value_or_die: target=mon.a 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:123: get_config_value_or_die: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:18.858 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: sudo ceph daemon mon.a config get mon_pg_warn_min_objects 2026-03-31T20:22:18.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: raw='{ 2026-03-31T20:22:18.933 INFO:tasks.workunit.client.0.vm03.stderr: "mon_pg_warn_min_objects": "10000" 2026-03-31T20:22:18.933 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:22:18.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:126: get_config_value_or_die: [[ 0 -ne 0 ]] 2026-03-31T20:22:18.933 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: echo '{' '"mon_pg_warn_min_objects":' '"10000"' '}' 2026-03-31T20:22:18.933 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: sed -e 's/[{} "]//g' 2026-03-31T20:22:18.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: raw=mon_pg_warn_min_objects:10000 2026-03-31T20:22:18.934 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: echo mon_pg_warn_min_objects:10000 2026-03-31T20:22:18.934 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: cut -f2 -d: 2026-03-31T20:22:18.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: val=10000 2026-03-31T20:22:18.935 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:134: get_config_value_or_die: echo 10000 2026-03-31T20:22:18.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:135: get_config_value_or_die: return 0 2026-03-31T20:22:18.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: val=10000 2026-03-31T20:22:18.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:147: expect_config_value: [[ 10000 != \1\0\0\0\0 ]] 2026-03-31T20:22:18.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:246: test_mon_injectargs_SI: sudo ceph daemon mon.a config set mon_pg_warn_min_objects 1G 2026-03-31T20:22:19.004 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:22:19.004 INFO:tasks.workunit.client.0.vm03.stdout: "success": "mon_pg_warn_min_objects = '' (not observed, change may require restart) " 2026-03-31T20:22:19.004 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:22:19.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:247: test_mon_injectargs_SI: expect_config_value mon.a mon_pg_warn_min_objects 1000000000 2026-03-31T20:22:19.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:140: expect_config_value: local target config_opt expected_val val 2026-03-31T20:22:19.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:141: expect_config_value: target=mon.a 2026-03-31T20:22:19.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:142: expect_config_value: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:19.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:143: expect_config_value: expected_val=1000000000 2026-03-31T20:22:19.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: get_config_value_or_die mon.a mon_pg_warn_min_objects 2026-03-31T20:22:19.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:120: get_config_value_or_die: local target config_opt raw val 2026-03-31T20:22:19.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:122: get_config_value_or_die: target=mon.a 2026-03-31T20:22:19.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:123: get_config_value_or_die: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:19.013 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: sudo ceph daemon mon.a config get mon_pg_warn_min_objects 2026-03-31T20:22:19.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: raw='{ 2026-03-31T20:22:19.085 INFO:tasks.workunit.client.0.vm03.stderr: "mon_pg_warn_min_objects": "1000000000" 2026-03-31T20:22:19.085 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:22:19.085 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:126: get_config_value_or_die: [[ 0 -ne 0 ]] 2026-03-31T20:22:19.085 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: echo '{' '"mon_pg_warn_min_objects":' '"1000000000"' '}' 2026-03-31T20:22:19.085 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: sed -e 's/[{} "]//g' 2026-03-31T20:22:19.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: raw=mon_pg_warn_min_objects:1000000000 2026-03-31T20:22:19.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: echo mon_pg_warn_min_objects:1000000000 2026-03-31T20:22:19.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: cut -f2 -d: 2026-03-31T20:22:19.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: val=1000000000 2026-03-31T20:22:19.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:134: get_config_value_or_die: echo 1000000000 2026-03-31T20:22:19.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:135: get_config_value_or_die: return 0 2026-03-31T20:22:19.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: val=1000000000 2026-03-31T20:22:19.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:147: expect_config_value: [[ 1000000000 != \1\0\0\0\0\0\0\0\0\0 ]] 2026-03-31T20:22:19.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:248: test_mon_injectargs_SI: sudo ceph daemon mon.a config set mon_pg_warn_min_objects 10F 2026-03-31T20:22:19.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:249: test_mon_injectargs_SI: check_response '(22) Invalid argument' 2026-03-31T20:22:19.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='(22) Invalid argument' 2026-03-31T20:22:19.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:22:19.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:22:19.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:22:19.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- '(22) Invalid argument' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:22:19.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:251: test_mon_injectargs_SI: 
ceph tell mon.a injectargs '--mon_pg_warn_min_objects 10' 2026-03-31T20:22:19.231 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:252: test_mon_injectargs_SI: expect_config_value mon.a mon_pg_warn_min_objects 10 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:140: expect_config_value: local target config_opt expected_val val 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:141: expect_config_value: target=mon.a 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:142: expect_config_value: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:143: expect_config_value: expected_val=10 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: get_config_value_or_die mon.a mon_pg_warn_min_objects 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:120: get_config_value_or_die: local target config_opt raw val 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:122: get_config_value_or_die: target=mon.a 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:123: get_config_value_or_die: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:19.241 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: sudo ceph daemon mon.a config get mon_pg_warn_min_objects 2026-03-31T20:22:19.315 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: raw='{ 2026-03-31T20:22:19.315 INFO:tasks.workunit.client.0.vm03.stderr: "mon_pg_warn_min_objects": "10" 2026-03-31T20:22:19.315 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:22:19.315 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:126: get_config_value_or_die: [[ 0 -ne 0 ]] 2026-03-31T20:22:19.315 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: echo '{' '"mon_pg_warn_min_objects":' '"10"' '}' 2026-03-31T20:22:19.315 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: sed -e 's/[{} "]//g' 2026-03-31T20:22:19.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: raw=mon_pg_warn_min_objects:10 2026-03-31T20:22:19.316 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: echo mon_pg_warn_min_objects:10 2026-03-31T20:22:19.316 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: cut -f2 
-d: 2026-03-31T20:22:19.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: val=10 2026-03-31T20:22:19.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:134: get_config_value_or_die: echo 10 2026-03-31T20:22:19.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:135: get_config_value_or_die: return 0 2026-03-31T20:22:19.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: val=10 2026-03-31T20:22:19.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:147: expect_config_value: [[ 10 != \1\0 ]] 2026-03-31T20:22:19.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:253: test_mon_injectargs_SI: ceph tell mon.a injectargs '--mon_pg_warn_min_objects 10K' 2026-03-31T20:22:19.383 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:254: test_mon_injectargs_SI: expect_config_value mon.a mon_pg_warn_min_objects 10000 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:140: expect_config_value: local target config_opt expected_val val 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:141: expect_config_value: target=mon.a 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:142: expect_config_value: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:143: expect_config_value: expected_val=10000 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: get_config_value_or_die mon.a mon_pg_warn_min_objects 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:120: get_config_value_or_die: local target config_opt raw val 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:122: get_config_value_or_die: target=mon.a 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:123: get_config_value_or_die: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:19.392 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: sudo ceph daemon mon.a config get mon_pg_warn_min_objects 2026-03-31T20:22:19.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: raw='{ 2026-03-31T20:22:19.466 INFO:tasks.workunit.client.0.vm03.stderr: "mon_pg_warn_min_objects": "10000" 2026-03-31T20:22:19.466 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:22:19.466 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:126: get_config_value_or_die: [[ 0 -ne 0 ]] 2026-03-31T20:22:19.466 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: echo '{' '"mon_pg_warn_min_objects":' '"10000"' '}' 2026-03-31T20:22:19.466 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: sed -e 's/[{} "]//g' 2026-03-31T20:22:19.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: raw=mon_pg_warn_min_objects:10000 2026-03-31T20:22:19.467 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: echo mon_pg_warn_min_objects:10000 2026-03-31T20:22:19.467 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: cut -f2 -d: 2026-03-31T20:22:19.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: val=10000 2026-03-31T20:22:19.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:134: get_config_value_or_die: echo 10000 2026-03-31T20:22:19.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:135: get_config_value_or_die: return 0 2026-03-31T20:22:19.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: val=10000 2026-03-31T20:22:19.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:147: expect_config_value: [[ 10000 != \1\0\0\0\0 ]] 2026-03-31T20:22:19.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:255: test_mon_injectargs_SI: ceph tell mon.a injectargs '--mon_pg_warn_min_objects 1G' 2026-03-31T20:22:19.534 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:19.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:256: test_mon_injectargs_SI: expect_config_value mon.a mon_pg_warn_min_objects 1000000000 2026-03-31T20:22:19.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:140: expect_config_value: local target config_opt expected_val val 2026-03-31T20:22:19.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:141: expect_config_value: target=mon.a 2026-03-31T20:22:19.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:142: expect_config_value: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:19.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:143: expect_config_value: expected_val=1000000000 2026-03-31T20:22:19.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: get_config_value_or_die mon.a mon_pg_warn_min_objects 2026-03-31T20:22:19.543 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:120: get_config_value_or_die: local target config_opt raw val 2026-03-31T20:22:19.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:122: get_config_value_or_die: target=mon.a 2026-03-31T20:22:19.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:123: get_config_value_or_die: config_opt=mon_pg_warn_min_objects 2026-03-31T20:22:19.543 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: sudo ceph daemon mon.a config get mon_pg_warn_min_objects 2026-03-31T20:22:19.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: raw='{ 2026-03-31T20:22:19.619 INFO:tasks.workunit.client.0.vm03.stderr: "mon_pg_warn_min_objects": "1000000000" 2026-03-31T20:22:19.619 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:22:19.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:126: get_config_value_or_die: [[ 0 -ne 0 ]] 2026-03-31T20:22:19.619 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: echo '{' '"mon_pg_warn_min_objects":' '"1000000000"' '}' 2026-03-31T20:22:19.619 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: sed -e 's/[{} "]//g' 2026-03-31T20:22:19.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: raw=mon_pg_warn_min_objects:1000000000 2026-03-31T20:22:19.620 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: echo mon_pg_warn_min_objects:1000000000 2026-03-31T20:22:19.620 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: cut -f2 -d: 2026-03-31T20:22:19.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: val=1000000000 2026-03-31T20:22:19.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:134: get_config_value_or_die: echo 1000000000 2026-03-31T20:22:19.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:135: get_config_value_or_die: return 0 2026-03-31T20:22:19.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: val=1000000000 2026-03-31T20:22:19.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:147: expect_config_value: [[ 1000000000 != \1\0\0\0\0\0\0\0\0\0 ]] 2026-03-31T20:22:19.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:257: test_mon_injectargs_SI: expect_false ceph tell mon.a injectargs '--mon_pg_warn_min_objects 10F' 2026-03-31T20:22:19.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: 
expect_false: set -x 2026-03-31T20:22:19.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell mon.a injectargs '--mon_pg_warn_min_objects 10F' 2026-03-31T20:22:19.684 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: Parse error setting mon_pg_warn_min_objects to '10F' using injectargs (strict_si_cast: unit prefix not recognized). 2026-03-31T20:22:19.684 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:22:19.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:19.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:258: test_mon_injectargs_SI: expect_false ceph tell mon.a injectargs '--mon_globalid_prealloc -1' 2026-03-31T20:22:19.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:19.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell mon.a injectargs '--mon_globalid_prealloc -1' 2026-03-31T20:22:19.748 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: Parse error setting mon_globalid_prealloc to '-1' using injectargs (strict_sistrtoll: value should not be negative). 2026-03-31T20:22:19.748 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:22:19.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:19.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:259: test_mon_injectargs_SI: sudo ceph daemon mon.a config set mon_pg_warn_min_objects 10000 2026-03-31T20:22:19.814 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:22:19.814 INFO:tasks.workunit.client.0.vm03.stdout: "success": "mon_pg_warn_min_objects = '' (not observed, change may require restart) " 2026-03-31T20:22:19.814 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:22:19.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:22:20.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_1 2026-03-31T20:22:20.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:339: test_tiering_1: ceph osd pool create slow 2 2026-03-31T20:22:20.647 INFO:tasks.workunit.client.0.vm03.stderr:pool 'slow' already exists 2026-03-31T20:22:20.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:340: test_tiering_1: ceph osd pool application enable slow rados 2026-03-31T20:22:22.600 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'slow' 2026-03-31T20:22:22.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:341: test_tiering_1: ceph osd pool create slow2 2 2026-03-31T20:22:23.675 INFO:tasks.workunit.client.0.vm03.stderr:pool 'slow2' already exists 2026-03-31T20:22:23.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:342: test_tiering_1: ceph osd pool application enable slow2 rados 2026-03-31T20:22:25.638 
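The test_mon_injectargs_SI block that finishes above exercises strict SI parsing of integer options through both interfaces. Condensed to the commands and the values the trace verified (not the verbatim script):

    # via the admin socket
    sudo ceph daemon mon.a config set mon_pg_warn_min_objects 1G    # readback: 1000000000
    sudo ceph daemon mon.a config set mon_pg_warn_min_objects 10F   # rejected: (22) Invalid argument

    # via the mon
    ceph tell mon.a injectargs '--mon_pg_warn_min_objects 10'       # readback: 10
    ceph tell mon.a injectargs '--mon_pg_warn_min_objects 10K'      # readback: 10000
    ceph tell mon.a injectargs '--mon_pg_warn_min_objects 1G'       # readback: 1000000000
    ceph tell mon.a injectargs '--mon_pg_warn_min_objects 10F'      # EINVAL: strict_si_cast: unit prefix not recognized
    ceph tell mon.a injectargs '--mon_globalid_prealloc -1'         # EINVAL: strict_sistrtoll: value should not be negative

The block ends by restoring mon_pg_warn_min_objects to 10000 over the admin socket before test_tiering_1 starts.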
INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'slow2' 2026-03-31T20:22:25.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:343: test_tiering_1: ceph osd pool create cache 2 2026-03-31T20:22:26.708 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache' already exists 2026-03-31T20:22:26.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:344: test_tiering_1: ceph osd pool create cache2 2 2026-03-31T20:22:27.714 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache2' already exists 2026-03-31T20:22:27.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:345: test_tiering_1: ceph osd tier add slow cache 2026-03-31T20:22:28.727 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache' is now (or already was) a tier of 'slow' 2026-03-31T20:22:28.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:346: test_tiering_1: ceph osd tier add slow cache2 2026-03-31T20:22:29.723 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache2' is now (or already was) a tier of 'slow' 2026-03-31T20:22:29.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:347: test_tiering_1: expect_false ceph osd tier add slow2 cache 2026-03-31T20:22:29.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:29.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier add slow2 cache 2026-03-31T20:22:29.892 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: tier pool 'cache' is already a tier of 'slow' 2026-03-31T20:22:29.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:29.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:349: test_tiering_1: ceph osd pool ls detail -f json 2026-03-31T20:22:29.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:349: test_tiering_1: jq '.[] | select(.pool_name == "slow") | .application_metadata["rados"]' 2026-03-31T20:22:29.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:349: test_tiering_1: grep '{}' 2026-03-31T20:22:30.108 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:30.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:350: test_tiering_1: ceph osd pool ls detail -f json 2026-03-31T20:22:30.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:350: test_tiering_1: jq '.[] | select(.pool_name == "slow2") | .application_metadata["rados"]' 2026-03-31T20:22:30.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:350: test_tiering_1: grep '{}' 2026-03-31T20:22:30.319 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:30.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:351: test_tiering_1: ceph osd pool ls detail -f json 2026-03-31T20:22:30.320 
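The tier setup above pins down two rules: a base pool may carry several tiers, but a tier pool belongs to at most one base, and "osd tier add" leaves each pool's application metadata untouched (the 'rados' entry stays {}). Condensed from the trace:

    ceph osd tier add slow cache      # ok
    ceph osd tier add slow cache2     # ok: second tier on the same base
    ceph osd tier add slow2 cache     # EINVAL: 'cache' is already a tier of 'slow'

    # the metadata check, repeated for slow, slow2, cache and cache2:
    ceph osd pool ls detail -f json |
        jq '.[] | select(.pool_name == "slow") | .application_metadata["rados"]' |
        grep '{}'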
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:351: test_tiering_1: jq '.[] | select(.pool_name == "cache") | .application_metadata["rados"]' 2026-03-31T20:22:30.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:351: test_tiering_1: grep '{}' 2026-03-31T20:22:30.532 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:30.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:352: test_tiering_1: ceph osd pool ls detail -f json 2026-03-31T20:22:30.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:352: test_tiering_1: jq '.[] | select(.pool_name == "cache2") | .application_metadata["rados"]' 2026-03-31T20:22:30.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:352: test_tiering_1: grep '{}' 2026-03-31T20:22:30.756 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:22:30.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:354: test_tiering_1: expect_false ceph osd tier cache-mode cache forward 2026-03-31T20:22:30.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:30.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier cache-mode cache forward 2026-03-31T20:22:30.913 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: forward not in writeback|proxy|readproxy|readonly|none 2026-03-31T20:22:30.913 INFO:tasks.workunit.client.0.vm03.stderr:osd tier cache-mode <pool> <writeback|proxy|readproxy|readonly|none> [--yes-i-really-mean-it] : specify the caching mode for cache tier 2026-03-31T20:22:30.913 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:22:30.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:30.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:355: test_tiering_1: expect_false ceph osd tier cache-mode cache forward --yes-i-really-mean-it 2026-03-31T20:22:30.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:30.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier cache-mode cache forward --yes-i-really-mean-it 2026-03-31T20:22:31.072 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: forward not in writeback|proxy|readproxy|readonly|none 2026-03-31T20:22:31.072 INFO:tasks.workunit.client.0.vm03.stderr:osd tier cache-mode <pool> <writeback|proxy|readproxy|readonly|none> [--yes-i-really-mean-it] : specify the caching mode for cache tier 2026-03-31T20:22:31.072 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:22:31.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:31.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:357: test_tiering_1: ceph osd tier cache-mode cache writeback 2026-03-31T20:22:31.748 
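Both rejections above come from CLI argument validation rather than from the OSDMap logic: 'forward' is simply no longer one of the accepted cache modes, so the override flag cannot resurrect it.

    ceph osd tier cache-mode cache forward                         # EINVAL: invalid command
    ceph osd tier cache-mode cache forward --yes-i-really-mean-it  # EINVAL: same validation error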
INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to writeback 2026-03-31T20:22:31.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:358: test_tiering_1: expect_false ceph osd tier cache-mode cache readonly 2026-03-31T20:22:31.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:31.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier cache-mode cache readonly 2026-03-31T20:22:31.914 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: 'readonly' is not a well-supported cache mode and may corrupt your data. pass --yes-i-really-mean-it to force. 2026-03-31T20:22:31.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:31.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:359: test_tiering_1: expect_false ceph osd tier cache-mode cache readonly --yes-i-really-mean-it 2026-03-31T20:22:31.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:31.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier cache-mode cache readonly --yes-i-really-mean-it 2026-03-31T20:22:32.074 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unable to set cache-mode 'readonly' on a 'writeback' pool; only 'proxy','readproxy' allowed. 2026-03-31T20:22:32.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:32.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:360: test_tiering_1: ceph osd tier cache-mode cache proxy 2026-03-31T20:22:32.755 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to proxy 2026-03-31T20:22:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:361: test_tiering_1: ceph osd tier cache-mode cache readproxy 2026-03-31T20:22:33.766 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to readproxy 2026-03-31T20:22:33.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:362: test_tiering_1: ceph osd tier cache-mode cache none 2026-03-31T20:22:34.762 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to none 2026-03-31T20:22:34.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:363: test_tiering_1: ceph osd tier cache-mode cache readonly --yes-i-really-mean-it 2026-03-31T20:22:35.764 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to readonly 2026-03-31T20:22:35.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:364: test_tiering_1: ceph osd tier cache-mode cache none 2026-03-31T20:22:36.773 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to none 2026-03-31T20:22:36.785 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:365: test_tiering_1: ceph osd tier cache-mode cache writeback 2026-03-31T20:22:37.781 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to writeback 2026-03-31T20:22:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:366: test_tiering_1: ceph osd tier cache-mode cache proxy 2026-03-31T20:22:38.790 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to proxy 2026-03-31T20:22:38.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:367: test_tiering_1: ceph osd tier cache-mode cache writeback 2026-03-31T20:22:39.789 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to writeback 2026-03-31T20:22:39.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:368: test_tiering_1: expect_false ceph osd tier cache-mode cache none 2026-03-31T20:22:39.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:39.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier cache-mode cache none 2026-03-31T20:22:39.958 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unable to set cache-mode 'none' on a 'writeback' pool; only 'proxy','readproxy' allowed. 2026-03-31T20:22:39.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:39.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:369: test_tiering_1: expect_false ceph osd tier cache-mode cache readonly --yes-i-really-mean-it 2026-03-31T20:22:39.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:39.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier cache-mode cache readonly --yes-i-really-mean-it 2026-03-31T20:22:40.125 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unable to set cache-mode 'readonly' on a 'writeback' pool; only 'proxy','readproxy' allowed. 
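Taken together, the records above trace out the cache-mode state machine as this build enforces it (commands as in the trace, results as the mon reported them):

    ceph osd tier cache-mode cache writeback                        # ok
    ceph osd tier cache-mode cache readonly                         # EPERM: not well-supported, needs --yes-i-really-mean-it
    ceph osd tier cache-mode cache readonly --yes-i-really-mean-it  # EINVAL from 'writeback': only 'proxy','readproxy' allowed
    ceph osd tier cache-mode cache proxy                            # ok
    ceph osd tier cache-mode cache readproxy                        # ok
    ceph osd tier cache-mode cache none                             # ok from readproxy
    ceph osd tier cache-mode cache readonly --yes-i-really-mean-it  # ok from none
    # back in 'writeback', both 'none' and 'readonly' are EINVAL again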
2026-03-31T20:22:40.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:40.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:372: test_tiering_1: rados -p cache put /etc/passwd /etc/passwd 2026-03-31T20:22:40.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:373: test_tiering_1: flush_pg_stats 2026-03-31T20:22:40.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: local timeout=300 2026-03-31T20:22:40.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls 2026-03-31T20:22:40.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0 2026-03-31T20:22:40.374 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:22:40.374 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:22:40.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs= 2026-03-31T20:22:40.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:22:40.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats 2026-03-31T20:22:40.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738386 2026-03-31T20:22:40.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738386 2026-03-31T20:22:40.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-34359738386' 2026-03-31T20:22:40.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:22:40.460 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats 2026-03-31T20:22:40.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738386 2026-03-31T20:22:40.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738386 2026-03-31T20:22:40.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-34359738386 1-34359738386' 2026-03-31T20:22:40.548 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:22:40.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats 2026-03-31T20:22:40.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738386 2026-03-31T20:22:40.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738386 2026-03-31T20:22:40.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-34359738386 1-34359738386 2-34359738386' 2026-03-31T20:22:40.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:22:40.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-34359738386 2026-03-31T20:22:40.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:22:40.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0 2026-03-31T20:22:40.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-34359738386 2026-03-31T20:22:40.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:22:40.638 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 34359738386 2026-03-31T20:22:40.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738386 2026-03-31T20:22:40.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 34359738386' 2026-03-31T20:22:40.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:22:40.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738386 -lt 34359738386 2026-03-31T20:22:40.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:22:40.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738386 2026-03-31T20:22:40.858 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:22:40.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1 2026-03-31T20:22:40.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738386 2026-03-31T20:22:40.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:22:40.861 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738386 2026-03-31T20:22:40.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738386 2026-03-31T20:22:40.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738386' 2026-03-31T20:22:40.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:22:41.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738385 -lt 34359738386 2026-03-31T20:22:41.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:22:42.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-31T20:22:42.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:22:42.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738385 -lt 34359738386 2026-03-31T20:22:42.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:22:43.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-31T20:22:43.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:22:43.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738386 -lt 34359738386 2026-03-31T20:22:43.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:22:43.507 
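The loop being traced here is the flush_pg_stats helper from qa/standalone/ceph-helpers.sh: every OSD is told to publish its PG stats and answers with a stat sequence number, then the helper polls the mon until each OSD's last-stat-seq has caught up. Reassembled from the xtrace (ceph-helpers.sh lines 2258-2276), with the timeout bookkeeping approximated:

    flush_pg_stats() {
        local timeout=${1:-300}
        local ids seqs s osd seq
        ids=$(ceph osd ls)
        seqs=''
        for osd in $ids; do
            # each OSD returns the sequence number of the stats it just flushed
            seq=$(timeout $timeout ceph tell osd.$osd flush_pg_stats)
            test -z "$seq" && continue
            seqs="$seqs ${osd}-${seq}"
        done
        for s in $seqs; do
            osd=$(echo $s | cut -d - -f 1)
            seq=$(echo $s | cut -d - -f 2)
            echo "waiting osd.$osd seq $seq"
            # poll until the mon has absorbed at least that sequence
            while test $(ceph osd last-stat-seq $osd) -lt $seq; do
                sleep 1
                [ $timeout -eq 0 ] && return 1
                timeout=$((timeout - 1))
            done
        done
    }

This is why osd.1 needed two sleep/poll rounds above: its last-stat-seq lagged one behind (34359738385 vs 34359738386) until the mon caught up.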
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738386 2026-03-31T20:22:43.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:22:43.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2 2026-03-31T20:22:43.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738386 2026-03-31T20:22:43.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:22:43.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738386 2026-03-31T20:22:43.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738386' 2026-03-31T20:22:43.510 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738386 2026-03-31T20:22:43.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-31T20:22:43.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738386 -lt 34359738386 2026-03-31T20:22:43.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:375: test_tiering_1: ceph osd tier cache-mode cache proxy 2026-03-31T20:22:44.819 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to proxy 2026-03-31T20:22:44.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:376: test_tiering_1: expect_false ceph osd tier cache-mode cache none 2026-03-31T20:22:44.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:44.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier cache-mode cache none 2026-03-31T20:22:44.985 INFO:tasks.workunit.client.0.vm03.stderr:Error EBUSY: unable to set cache-mode 'none' on pool 'cache': dirty objects found 2026-03-31T20:22:44.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:44.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:377: test_tiering_1: expect_false ceph osd tier cache-mode cache readonly --yes-i-really-mean-it 2026-03-31T20:22:44.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:44.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: 
ceph osd tier cache-mode cache readonly --yes-i-really-mean-it 2026-03-31T20:22:45.153 INFO:tasks.workunit.client.0.vm03.stderr:Error EBUSY: unable to set cache-mode 'readonly' on pool 'cache': dirty objects found 2026-03-31T20:22:45.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:45.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:378: test_tiering_1: ceph osd tier cache-mode cache writeback 2026-03-31T20:22:45.826 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to writeback 2026-03-31T20:22:45.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:380: test_tiering_1: rados -p cache rm /etc/passwd 2026-03-31T20:22:45.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:381: test_tiering_1: rados -p cache cache-flush-evict-all 2026-03-31T20:22:45.886 INFO:tasks.workunit.client.0.vm03.stdout: /etc/passwd 2026-03-31T20:22:45.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:382: test_tiering_1: flush_pg_stats 2026-03-31T20:22:45.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: local timeout=300 2026-03-31T20:22:45.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls 2026-03-31T20:22:46.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0 2026-03-31T20:22:46.110 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:22:46.110 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:22:46.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs= 2026-03-31T20:22:46.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:22:46.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats 2026-03-31T20:22:46.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738389 2026-03-31T20:22:46.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738389 2026-03-31T20:22:46.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-34359738389' 2026-03-31T20:22:46.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:22:46.200 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats 2026-03-31T20:22:46.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738389 2026-03-31T20:22:46.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738389 2026-03-31T20:22:46.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-34359738389 1-34359738389' 2026-03-31T20:22:46.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:22:46.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats 2026-03-31T20:22:46.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738389 2026-03-31T20:22:46.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738389 2026-03-31T20:22:46.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-34359738389 1-34359738389 2-34359738389' 2026-03-31T20:22:46.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:22:46.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-34359738389 2026-03-31T20:22:46.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:22:46.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0 2026-03-31T20:22:46.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-34359738389 2026-03-31T20:22:46.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:22:46.395 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 34359738389 2026-03-31T20:22:46.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738389 2026-03-31T20:22:46.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 34359738389' 2026-03-31T20:22:46.395 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:22:46.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738387 -lt 34359738389 2026-03-31T20:22:46.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:22:47.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-31T20:22:47.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:22:47.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738389 -lt 34359738389 2026-03-31T20:22:47.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:22:47.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738389 2026-03-31T20:22:47.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:22:47.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1 2026-03-31T20:22:47.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738389 2026-03-31T20:22:47.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:22:47.830 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738389 2026-03-31T20:22:47.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738389 2026-03-31T20:22:47.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738389' 2026-03-31T20:22:47.830 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:22:48.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738389 -lt 34359738389 2026-03-31T20:22:48.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:22:48.041 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738389 2026-03-31T20:22:48.041 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:22:48.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2 2026-03-31T20:22:48.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738389 2026-03-31T20:22:48.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:22:48.044 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738389 2026-03-31T20:22:48.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738389 2026-03-31T20:22:48.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738389' 2026-03-31T20:22:48.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-31T20:22:48.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738389 -lt 34359738389 2026-03-31T20:22:48.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:384: test_tiering_1: ceph osd tier cache-mode cache proxy 2026-03-31T20:22:49.035 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to proxy 2026-03-31T20:22:49.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:385: test_tiering_1: ceph osd tier cache-mode cache none 2026-03-31T20:22:50.041 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to none 2026-03-31T20:22:50.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:386: test_tiering_1: ceph osd tier cache-mode cache readonly --yes-i-really-mean-it 2026-03-31T20:22:51.048 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to readonly 2026-03-31T20:22:51.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:387: test_tiering_1: TRIES=0 2026-03-31T20:22:51.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:388: test_tiering_1: ceph osd pool set cache pg_num 3 --yes-i-really-mean-it 2026-03-31T20:22:53.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:395: test_tiering_1: expect_false ceph osd pool set cache pg_num 4 2026-03-31T20:22:53.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:53.022 
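The flush_pg_stats helper traced above comes from qa/standalone/ceph-helpers.sh: it tells each OSD to flush its PG stats, records the sequence number each OSD reports back, then polls the monitor until `ceph osd last-stat-seq` catches up for every OSD. A condensed bash sketch of the traced logic (variable handling simplified; the authoritative version is in ceph-helpers.sh):

    flush_pg_stats() {
        local timeout=${1:-300}
        local seqs="" osd seq
        # ask every OSD to flush its PG stats, remembering the seq it reports
        for osd in $(ceph osd ls); do
            seq=$(timeout $timeout ceph tell osd.$osd flush_pg_stats)
            test -z "$seq" && return 1
            seqs="$seqs $osd-$seq"
        done
        # block until the monitor has ingested each OSD's stats up to that seq
        for s in $seqs; do
            osd=$(echo $s | cut -d - -f 1)
            seq=$(echo $s | cut -d - -f 2)
            echo "waiting osd.$osd seq $seq"
            while test $(ceph osd last-stat-seq $osd) -lt $seq; do
                sleep 1
                timeout=$(( timeout - 1 ))
                test $timeout -eq 0 && return 1
            done
        done
    }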
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set cache pg_num 4 2026-03-31T20:22:53.166 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: splits in cache pools must be followed by scrubs and leave sufficient free space to avoid overfilling. use --yes-i-really-mean-it to force. 2026-03-31T20:22:53.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:53.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:396: test_tiering_1: ceph osd tier cache-mode cache none 2026-03-31T20:22:54.077 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache' to none 2026-03-31T20:22:54.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:397: test_tiering_1: ceph osd tier set-overlay slow cache 2026-03-31T20:22:55.080 INFO:tasks.workunit.client.0.vm03.stderr:overlay for 'slow' is now (or already was) 'cache' 2026-03-31T20:22:55.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:398: test_tiering_1: expect_false ceph osd tier set-overlay slow cache2 2026-03-31T20:22:55.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:55.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier set-overlay slow cache2 2026-03-31T20:22:55.251 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: pool 'slow' has overlay 'cache'; please remove-overlay first 2026-03-31T20:22:55.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:55.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:399: test_tiering_1: expect_false ceph osd tier remove slow cache 2026-03-31T20:22:55.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:22:55.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier remove slow cache 2026-03-31T20:22:55.411 INFO:tasks.workunit.client.0.vm03.stderr:Error EBUSY: tier pool 'cache' is the overlay for 'slow'; please remove-overlay first 2026-03-31T20:22:55.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:22:55.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:400: test_tiering_1: ceph osd tier remove-overlay slow 2026-03-31T20:22:56.088 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'slow' 2026-03-31T20:22:56.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:401: test_tiering_1: ceph osd tier set-overlay slow cache2 2026-03-31T20:22:57.109 INFO:tasks.workunit.client.0.vm03.stderr:overlay for 'slow' is now (or already was) 'cache2' 2026-03-31T20:22:57.121 
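The two expect_false probes above pin down the ordering the monitor enforces for overlays: while 'cache' is the overlay for 'slow', the base pool refuses a second overlay (EINVAL) and the tier cannot be detached (EBUSY). The valid teardown order, condensed from the trace:

    ceph osd tier set-overlay slow cache2   # EINVAL: pool 'slow' has overlay 'cache'
    ceph osd tier remove slow cache         # EBUSY: 'cache' is the overlay for 'slow'
    ceph osd tier remove-overlay slow       # drop the overlay first...
    ceph osd tier remove slow cache         # ...then detaching the tier succeeds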
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:402: test_tiering_1: ceph osd tier remove-overlay slow 2026-03-31T20:22:58.114 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'slow' 2026-03-31T20:22:58.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:403: test_tiering_1: ceph osd tier remove slow cache 2026-03-31T20:22:59.134 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache' is now (or already was) not a tier of 'slow' 2026-03-31T20:22:59.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:404: test_tiering_1: ceph osd tier add slow2 cache 2026-03-31T20:23:00.132 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache' is now (or already was) a tier of 'slow2' 2026-03-31T20:23:00.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:405: test_tiering_1: expect_false ceph osd tier set-overlay slow cache 2026-03-31T20:23:00.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:23:00.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier set-overlay slow cache 2026-03-31T20:23:00.300 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: tier pool 'cache' is not a tier of 'slow' 2026-03-31T20:23:00.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:23:00.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:406: test_tiering_1: ceph osd tier set-overlay slow2 cache 2026-03-31T20:23:01.139 INFO:tasks.workunit.client.0.vm03.stderr:overlay for 'slow2' is now (or already was) 'cache' 2026-03-31T20:23:01.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:407: test_tiering_1: ceph osd tier remove-overlay slow2 2026-03-31T20:23:02.150 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'slow2' 2026-03-31T20:23:02.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:408: test_tiering_1: ceph osd tier remove slow2 cache 2026-03-31T20:23:03.152 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache' is now (or already was) not a tier of 'slow2' 2026-03-31T20:23:03.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:409: test_tiering_1: ceph osd tier remove slow cache2 2026-03-31T20:23:04.161 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache2' is now (or already was) not a tier of 'slow' 2026-03-31T20:23:04.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:412: test_tiering_1: rados -p cache2 put /etc/passwd /etc/passwd 2026-03-31T20:23:04.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: ceph df 2026-03-31T20:23:04.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: grep cache2 2026-03-31T20:23:04.198 
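Writes only become visible in ceph df once the OSDs' PG stats reach the mgr/mon, so the test polls for the expected object count rather than asserting once; the retry loop traced in the next stretch of output distills to:

    # retry until 'ceph df' shows exactly 1 object in pool cache2
    until ceph df | grep cache2 | grep -q ' 1 '; do
        echo "waiting for pg stats to flush"
        sleep 2
    done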
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: grep ' 1 ' 2026-03-31T20:23:04.467 INFO:tasks.workunit.client.0.vm03.stdout:waiting for pg stats to flush 2026-03-31T20:23:04.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:414: test_tiering_1: echo waiting for pg stats to flush 2026-03-31T20:23:04.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:415: test_tiering_1: sleep 2 2026-03-31T20:23:06.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: ceph df 2026-03-31T20:23:06.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: grep cache2 2026-03-31T20:23:06.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: grep ' 1 ' 2026-03-31T20:23:06.748 INFO:tasks.workunit.client.0.vm03.stdout:waiting for pg stats to flush 2026-03-31T20:23:06.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:414: test_tiering_1: echo waiting for pg stats to flush 2026-03-31T20:23:06.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:415: test_tiering_1: sleep 2 2026-03-31T20:23:08.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: ceph df 2026-03-31T20:23:08.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: grep cache2 2026-03-31T20:23:08.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: grep ' 1 ' 2026-03-31T20:23:09.035 INFO:tasks.workunit.client.0.vm03.stdout:waiting for pg stats to flush 2026-03-31T20:23:09.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:414: test_tiering_1: echo waiting for pg stats to flush 2026-03-31T20:23:09.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:415: test_tiering_1: sleep 2 2026-03-31T20:23:11.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: ceph df 2026-03-31T20:23:11.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: grep cache2 2026-03-31T20:23:11.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:413: test_tiering_1: grep ' 1 ' 2026-03-31T20:23:11.304 INFO:tasks.workunit.client.0.vm03.stdout:cache2 6 2 2.1 KiB 1 8 KiB 0 128 GiB 2026-03-31T20:23:11.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:417: test_tiering_1: expect_false ceph osd tier add slow cache2 2026-03-31T20:23:11.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:23:11.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph 
osd tier add slow cache2 2026-03-31T20:23:11.456 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOTEMPTY: tier pool 'cache2' is not empty; --force-nonempty to force 2026-03-31T20:23:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:23:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:418: test_tiering_1: ceph osd tier add slow cache2 --force-nonempty 2026-03-31T20:23:12.192 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache2' is now (or already was) a tier of 'slow' 2026-03-31T20:23:12.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:419: test_tiering_1: ceph osd tier remove slow cache2 2026-03-31T20:23:13.203 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache2' is now (or already was) not a tier of 'slow' 2026-03-31T20:23:13.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:421: test_tiering_1: ceph osd pool ls 2026-03-31T20:23:13.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:421: test_tiering_1: grep cache2 2026-03-31T20:23:13.417 INFO:tasks.workunit.client.0.vm03.stdout:cache2 2026-03-31T20:23:13.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:422: test_tiering_1: ceph osd pool ls -f json-pretty 2026-03-31T20:23:13.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:422: test_tiering_1: grep cache2 2026-03-31T20:23:13.629 INFO:tasks.workunit.client.0.vm03.stdout: "cache2" 2026-03-31T20:23:13.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:423: test_tiering_1: grep cache2 2026-03-31T20:23:13.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:423: test_tiering_1: ceph osd pool ls detail 2026-03-31T20:23:13.832 INFO:tasks.workunit.client.0.vm03.stdout:pool 6 'cache2' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 2 pgp_num 2 autoscale_mode off last_change 55 lfor 47/47/47 flags hashpspool stripe_width 0 application rados read_balance_score 1.50 2026-03-31T20:23:13.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:424: test_tiering_1: ceph osd pool ls detail -f json-pretty 2026-03-31T20:23:13.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:424: test_tiering_1: grep cache2 2026-03-31T20:23:14.037 INFO:tasks.workunit.client.0.vm03.stdout: "pool_name": "cache2", 2026-03-31T20:23:14.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:426: test_tiering_1: ceph osd pool delete slow slow --yes-i-really-really-mean-it 2026-03-31T20:23:15.207 INFO:tasks.workunit.client.0.vm03.stderr:pool 'slow' does not exist 2026-03-31T20:23:15.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:427: test_tiering_1: ceph osd pool delete slow2 slow2 --yes-i-really-really-mean-it 2026-03-31T20:23:16.222 INFO:tasks.workunit.client.0.vm03.stderr:pool 'slow2' does not exist 2026-03-31T20:23:16.234 
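The ENOTEMPTY refusal above is the guard against silently shadowing existing data: a pool that already holds objects is rejected as a tier unless explicitly forced. Condensed from the trace:

    rados -p cache2 put /etc/passwd /etc/passwd       # cache2 now contains one object
    ceph osd tier add slow cache2                     # ENOTEMPTY: tier pool 'cache2' is not empty
    ceph osd tier add slow cache2 --force-nonempty    # explicit override is accepted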
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:428: test_tiering_1: ceph osd pool delete cache cache --yes-i-really-really-mean-it 2026-03-31T20:23:17.230 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache' does not exist 2026-03-31T20:23:17.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:429: test_tiering_1: ceph osd pool delete cache2 cache2 --yes-i-really-really-mean-it 2026-03-31T20:23:18.234 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache2' does not exist 2026-03-31T20:23:18.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:23:18.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_2 2026-03-31T20:23:18.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:435: test_tiering_2: ceph osd pool create snap_base 2 2026-03-31T20:23:19.240 INFO:tasks.workunit.client.0.vm03.stderr:pool 'snap_base' already exists 2026-03-31T20:23:19.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:436: test_tiering_2: ceph osd pool application enable snap_base rados 2026-03-31T20:23:21.190 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'snap_base' 2026-03-31T20:23:21.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:437: test_tiering_2: ceph osd pool create snap_cache 2 2026-03-31T20:23:22.257 INFO:tasks.workunit.client.0.vm03.stderr:pool 'snap_cache' already exists 2026-03-31T20:23:22.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:438: test_tiering_2: ceph osd pool mksnap snap_cache snapname 2026-03-31T20:23:23.269 INFO:tasks.workunit.client.0.vm03.stderr:pool snap_cache snap snapname already exists 2026-03-31T20:23:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:439: test_tiering_2: expect_false ceph osd tier add snap_base snap_cache 2026-03-31T20:23:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:23:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier add snap_base snap_cache 2026-03-31T20:23:23.431 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOTEMPTY: tier pool 'snap_cache' has snapshot state; it cannot be added as a tier without breaking the pool 2026-03-31T20:23:23.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:23:23.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:440: test_tiering_2: ceph osd pool delete snap_base snap_base --yes-i-really-really-mean-it 2026-03-31T20:23:24.275 INFO:tasks.workunit.client.0.vm03.stderr:pool 'snap_base' does not exist 2026-03-31T20:23:24.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:441: test_tiering_2: ceph osd pool delete snap_cache snap_cache --yes-i-really-really-mean-it 2026-03-31T20:23:25.282 
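test_tiering_2 checks the complementary guard: pool-level snapshot state cannot be carried into a tier at all, and unlike the non-empty case there is no force flag, because attaching the pool would break its snapshots. Condensed:

    ceph osd pool mksnap snap_cache snapname    # give the candidate tier snapshot state
    ceph osd tier add snap_base snap_cache      # ENOTEMPTY: 'snap_cache' has snapshot state;
                                                # it cannot be added as a tier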
INFO:tasks.workunit.client.0.vm03.stderr:pool 'snap_cache' does not exist 2026-03-31T20:23:25.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:23:25.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_3 2026-03-31T20:23:25.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:447: test_tiering_3: ceph osd pool create basex 2 2026-03-31T20:23:26.292 INFO:tasks.workunit.client.0.vm03.stderr:pool 'basex' already exists 2026-03-31T20:23:26.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:448: test_tiering_3: ceph osd pool application enable basex rados 2026-03-31T20:23:28.267 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'basex' 2026-03-31T20:23:28.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:449: test_tiering_3: ceph osd pool create cachex 2 2026-03-31T20:23:29.325 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachex' already exists 2026-03-31T20:23:29.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:450: test_tiering_3: ceph osd tier add basex cachex 2026-03-31T20:23:30.335 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachex' is now (or already was) a tier of 'basex' 2026-03-31T20:23:30.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:451: test_tiering_3: expect_false ceph osd pool mksnap cache snapname 2026-03-31T20:23:30.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:23:30.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool mksnap cache snapname 2026-03-31T20:23:30.491 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: unrecognized pool 'cache' 2026-03-31T20:23:30.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:23:30.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:452: test_tiering_3: ceph osd tier remove basex cachex 2026-03-31T20:23:31.342 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachex' is now (or already was) not a tier of 'basex' 2026-03-31T20:23:31.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:453: test_tiering_3: ceph osd pool delete basex basex --yes-i-really-really-mean-it 2026-03-31T20:23:32.345 INFO:tasks.workunit.client.0.vm03.stderr:pool 'basex' does not exist 2026-03-31T20:23:32.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:454: test_tiering_3: ceph osd pool delete cachex cachex --yes-i-really-really-mean-it 2026-03-31T20:23:33.353 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachex' does not exist 2026-03-31T20:23:33.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:23:33.567 
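The expect_false wrapper that runs throughout this output inverts a command's exit status, so "this must fail" assertions compose with the script's set -e. Reconstructed from the traced test.sh lines 35-36 (a sketch; the authoritative helper is in qa/workunits/cephtool/test.sh):

    expect_false() {
        set -x
        # succeed only if the wrapped command fails
        if "$@"; then return 1; else return 0; fi
    }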
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_4 2026-03-31T20:23:33.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:460: test_tiering_4: ceph osd pool create eccache 2 2 erasure 2026-03-31T20:23:35.363 INFO:tasks.workunit.client.0.vm03.stderr:pool 'eccache' already exists 2026-03-31T20:23:35.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:461: test_tiering_4: expect_false ceph osd set-require-min-compat-client bobtail 2026-03-31T20:23:35.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:23:35.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd set-require-min-compat-client bobtail 2026-03-31T20:23:35.523 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: osdmap current utilizes features that require jewel; cannot set require_min_compat_client below that to bobtail 2026-03-31T20:23:35.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:23:35.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:462: test_tiering_4: ceph osd pool create repbase 2 2026-03-31T20:23:36.376 INFO:tasks.workunit.client.0.vm03.stderr:pool 'repbase' already exists 2026-03-31T20:23:36.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:463: test_tiering_4: ceph osd pool application enable repbase rados 2026-03-31T20:23:38.332 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'repbase' 2026-03-31T20:23:38.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:464: test_tiering_4: expect_false ceph osd tier add repbase eccache 2026-03-31T20:23:38.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:23:38.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tier add repbase eccache 2026-03-31T20:23:38.501 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOTSUP: tier pool 'eccache' is an ec pool, which cannot be a tier 2026-03-31T20:23:38.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:23:38.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:465: test_tiering_4: ceph osd pool delete repbase repbase --yes-i-really-really-mean-it 2026-03-31T20:23:39.394 INFO:tasks.workunit.client.0.vm03.stderr:pool 'repbase' does not exist 2026-03-31T20:23:39.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:466: test_tiering_4: ceph osd pool delete eccache eccache --yes-i-really-really-mean-it 2026-03-31T20:23:40.406 INFO:tasks.workunit.client.0.vm03.stderr:pool 'eccache' does not exist 2026-03-31T20:23:40.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 
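test_tiering_4 collects two more hard refusals seen above: an erasure-coded pool is never accepted as a cache tier, and require_min_compat_client cannot be lowered beneath the features the current osdmap already uses. Condensed:

    ceph osd pool create eccache 2 2 erasure
    ceph osd tier add repbase eccache                # ENOTSUP: an ec pool cannot be a tier
    ceph osd set-require-min-compat-client bobtail   # EPERM: osdmap already requires jewel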
2026-03-31T20:23:40.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_5 2026-03-31T20:23:40.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:472: test_tiering_5: ceph osd pool create slow 2 2026-03-31T20:23:41.411 INFO:tasks.workunit.client.0.vm03.stderr:pool 'slow' already exists 2026-03-31T20:23:41.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:473: test_tiering_5: ceph osd pool application enable slow rados 2026-03-31T20:23:43.376 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'slow' 2026-03-31T20:23:43.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:474: test_tiering_5: ceph osd pool create cache3 2 2026-03-31T20:23:44.436 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache3' already exists 2026-03-31T20:23:44.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:475: test_tiering_5: ceph osd tier add-cache slow cache3 1024000 2026-03-31T20:23:45.451 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache3' is now (or already was) a tier of 'slow' 2026-03-31T20:23:45.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:476: test_tiering_5: ceph osd dump 2026-03-31T20:23:45.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:476: test_tiering_5: grep cache3 2026-03-31T20:23:45.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:476: test_tiering_5: grep bloom 2026-03-31T20:23:45.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:476: test_tiering_5: grep '1200s x4' 2026-03-31T20:23:45.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:476: test_tiering_5: grep 'target_bytes 1024000' 2026-03-31T20:23:45.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:476: test_tiering_5: grep 'false_positive_probability: 0.05' 2026-03-31T20:23:45.666 INFO:tasks.workunit.client.0.vm03.stdout:pool 14 'cache3' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 2 pgp_num 2 autoscale_mode off last_change 86 lfor 86/86/86 flags hashpspool,creating tier_of 13 cache_mode writeback target_bytes 1024000 hit_set bloom{false_positive_probability: 0.05, target_size: 0, seed: 0} 1200s x4 decay_rate 20 search_last_n 1 min_read_recency_for_promote 1 min_write_recency_for_promote 1 stripe_width 0 application rados read_balance_score 1.50 2026-03-31T20:23:45.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:477: test_tiering_5: ceph osd tier remove slow cache3 2026-03-31T20:23:45.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:477: test_tiering_5: true 2026-03-31T20:23:45.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:478: test_tiering_5: check_response 'EBUSY: tier pool '\''cache3'\'' is the overlay for '\''slow'\''; please remove-overlay first' 2026-03-31T20:23:45.821 
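ceph osd tier add-cache is the one-shot form: it attaches the tier and sets writeback mode, the byte budget, and a default bloom-filter hit_set in a single command, all of which the osd dump line above confirms. The verification greps, condensed (expected values copied from the dump output):

    ceph osd tier add-cache slow cache3 1024000    # last arg = target size in bytes
    ceph osd dump | grep cache3 | grep bloom \
        | grep '1200s x4' \
        | grep 'target_bytes 1024000' \
        | grep 'false_positive_probability: 0.05'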
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EBUSY: tier pool '\''cache3'\'' is the overlay for '\''slow'\''; please remove-overlay first' 2026-03-31T20:23:45.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:23:45.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:23:45.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:23:45.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EBUSY: tier pool '\''cache3'\'' is the overlay for '\''slow'\''; please remove-overlay first' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:23:45.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:479: test_tiering_5: ceph osd tier remove-overlay slow 2026-03-31T20:23:46.449 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'slow' 2026-03-31T20:23:46.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:480: test_tiering_5: ceph osd tier remove slow cache3 2026-03-31T20:23:47.460 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache3' is now (or already was) not a tier of 'slow' 2026-03-31T20:23:47.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:481: test_tiering_5: ceph osd pool ls 2026-03-31T20:23:47.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:481: test_tiering_5: grep cache3 2026-03-31T20:23:47.675 INFO:tasks.workunit.client.0.vm03.stdout:cache3 2026-03-31T20:23:47.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:482: test_tiering_5: ceph osd pool delete cache3 cache3 --yes-i-really-really-mean-it 2026-03-31T20:23:48.464 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache3' does not exist 2026-03-31T20:23:48.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:483: test_tiering_5: ceph osd pool ls 2026-03-31T20:23:48.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:483: test_tiering_5: grep cache3 2026-03-31T20:23:48.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:484: test_tiering_5: ceph osd pool delete slow slow --yes-i-really-really-mean-it 2026-03-31T20:23:49.468 INFO:tasks.workunit.client.0.vm03.stderr:pool 'slow' does not exist 2026-03-31T20:23:49.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:23:49.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_6 2026-03-31T20:23:49.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:490: test_tiering_6: ceph osd pool create datapool 2 2026-03-31T20:23:50.478 
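check_response, traced just above, matches an expected error string against the stderr that the preceding command left in the test's temp file (here /tmp/cephtool.sYl/test_invalid.NL9); an optional pair of return codes can be compared first. A condensed sketch of the traced logic (the $TMPFILE variable name is an assumption):

    check_response() {
        local expected_string="$1"
        local retcode="$2"
        local expected_retcode="$3"
        # compare return codes only when the caller supplied both
        if [ -n "$retcode" ] && [ -n "$expected_retcode" ] &&
           [ "$retcode" != "$expected_retcode" ]; then
            return 1
        fi
        # $TMPFILE holds the stderr captured from the command under test
        grep --quiet -- "$expected_string" "$TMPFILE"
    }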
INFO:tasks.workunit.client.0.vm03.stderr:pool 'datapool' already exists 2026-03-31T20:23:50.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:491: test_tiering_6: ceph osd pool application enable datapool rados 2026-03-31T20:23:52.439 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'datapool' 2026-03-31T20:23:52.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:492: test_tiering_6: ceph osd pool create cachepool 2 2026-03-31T20:23:53.503 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachepool' already exists 2026-03-31T20:23:53.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:493: test_tiering_6: ceph osd tier add-cache datapool cachepool 1024000 2026-03-31T20:23:54.511 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachepool' is now (or already was) a tier of 'datapool' 2026-03-31T20:23:54.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:494: test_tiering_6: ceph osd tier cache-mode cachepool writeback 2026-03-31T20:23:54.733 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cachepool' to writeback 2026-03-31T20:23:54.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:495: test_tiering_6: rados -p datapool put object /etc/passwd 2026-03-31T20:23:54.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:496: test_tiering_6: rados -p cachepool stat object 2026-03-31T20:23:54.786 INFO:tasks.workunit.client.0.vm03.stdout:cachepool/object mtime 2026-03-31T20:23:54.000000+0000, size 2163 2026-03-31T20:23:54.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:497: test_tiering_6: rados -p cachepool cache-flush object 2026-03-31T20:23:54.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:498: test_tiering_6: rados -p datapool stat object 2026-03-31T20:23:54.827 INFO:tasks.workunit.client.0.vm03.stdout:datapool/object mtime 2026-03-31T20:23:54.000000+0000, size 2163 2026-03-31T20:23:54.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:499: test_tiering_6: ceph osd tier remove-overlay datapool 2026-03-31T20:23:55.516 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'datapool' 2026-03-31T20:23:55.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:500: test_tiering_6: ceph osd tier remove datapool cachepool 2026-03-31T20:23:56.528 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachepool' is now (or already was) not a tier of 'datapool' 2026-03-31T20:23:56.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:501: test_tiering_6: ceph osd pool delete cachepool cachepool --yes-i-really-really-mean-it 2026-03-31T20:23:57.534 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachepool' does not exist 2026-03-31T20:23:57.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:502: test_tiering_6: ceph osd pool delete datapool datapool --yes-i-really-really-mean-it 2026-03-31T20:23:58.557 
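test_tiering_6 is the functional smoke test for writeback mode: a write addressed to the base pool lands in the cache tier, and an explicit cache-flush copies it down. Exactly the traced sequence:

    ceph osd tier cache-mode cachepool writeback
    rados -p datapool put object /etc/passwd    # write is absorbed by the overlay
    rados -p cachepool stat object              # object sits in the cache tier
    rados -p cachepool cache-flush object       # flush the dirty object to the base
    rados -p datapool stat object               # now present in datapool too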
INFO:tasks.workunit.client.0.vm03.stderr:pool 'datapool' does not exist 2026-03-31T20:23:58.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:23:58.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_7 2026-03-31T20:23:58.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:508: test_tiering_7: ceph osd pool create datapool 2 2026-03-31T20:23:59.560 INFO:tasks.workunit.client.0.vm03.stderr:pool 'datapool' already exists 2026-03-31T20:23:59.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:509: test_tiering_7: ceph osd pool application enable datapool rados 2026-03-31T20:24:01.518 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'datapool' 2026-03-31T20:24:01.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:510: test_tiering_7: ceph osd pool create cachepool 2 2026-03-31T20:24:02.580 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachepool' already exists 2026-03-31T20:24:02.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:511: test_tiering_7: ceph osd tier add-cache datapool cachepool 1024000 2026-03-31T20:24:03.589 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachepool' is now (or already was) a tier of 'datapool' 2026-03-31T20:24:03.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:512: test_tiering_7: ceph osd pool delete cachepool cachepool --yes-i-really-really-mean-it 2026-03-31T20:24:03.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:512: test_tiering_7: true 2026-03-31T20:24:03.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:513: test_tiering_7: check_response 'EBUSY: pool '\''cachepool'\'' is a tier of '\''datapool'\''' 2026-03-31T20:24:03.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EBUSY: pool '\''cachepool'\'' is a tier of '\''datapool'\''' 2026-03-31T20:24:03.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:24:03.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:24:03.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:24:03.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EBUSY: pool '\''cachepool'\'' is a tier of '\''datapool'\''' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:24:03.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:514: test_tiering_7: ceph osd pool delete datapool datapool --yes-i-really-really-mean-it 2026-03-31T20:24:03.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:514: 
test_tiering_7: true 2026-03-31T20:24:03.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:515: test_tiering_7: check_response 'EBUSY: pool '\''datapool'\'' has tiers cachepool' 2026-03-31T20:24:03.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EBUSY: pool '\''datapool'\'' has tiers cachepool' 2026-03-31T20:24:03.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:24:03.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:24:03.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:24:03.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EBUSY: pool '\''datapool'\'' has tiers cachepool' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:24:03.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:516: test_tiering_7: ceph osd tier remove-overlay datapool 2026-03-31T20:24:04.591 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'datapool' 2026-03-31T20:24:04.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:517: test_tiering_7: ceph osd tier remove datapool cachepool 2026-03-31T20:24:05.604 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachepool' is now (or already was) not a tier of 'datapool' 2026-03-31T20:24:05.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:518: test_tiering_7: ceph osd pool delete cachepool cachepool --yes-i-really-really-mean-it 2026-03-31T20:24:06.606 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cachepool' does not exist 2026-03-31T20:24:06.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:519: test_tiering_7: ceph osd pool delete datapool datapool --yes-i-really-really-mean-it 2026-03-31T20:24:07.612 INFO:tasks.workunit.client.0.vm03.stderr:pool 'datapool' does not exist 2026-03-31T20:24:07.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:24:07.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_8 2026-03-31T20:24:07.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:525: test_tiering_8: ceph osd set notieragent 2026-03-31T20:24:09.572 INFO:tasks.workunit.client.0.vm03.stderr:notieragent is set 2026-03-31T20:24:09.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:526: test_tiering_8: ceph osd pool create datapool 2 2026-03-31T20:24:10.634 INFO:tasks.workunit.client.0.vm03.stderr:pool 'datapool' already exists 2026-03-31T20:24:10.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:527: test_tiering_8: ceph osd pool application enable datapool rados 
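test_tiering_7, traced above, pins the deletion ordering: neither side of a live tier relationship can be deleted, even with --yes-i-really-really-mean-it. Condensed:

    ceph osd pool delete cachepool cachepool --yes-i-really-really-mean-it
                        # EBUSY: pool 'cachepool' is a tier of 'datapool'
    ceph osd pool delete datapool datapool --yes-i-really-really-mean-it
                        # EBUSY: pool 'datapool' has tiers cachepool
    ceph osd tier remove-overlay datapool       # tear down the relationship first
    ceph osd tier remove datapool cachepool     # ...only then can either pool be deleted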
2026-03-31T20:24:12.592 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'datapool' 2026-03-31T20:24:12.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:528: test_tiering_8: ceph osd pool create cache4 2 2026-03-31T20:24:13.650 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache4' already exists 2026-03-31T20:24:13.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:529: test_tiering_8: ceph osd tier add-cache datapool cache4 1024000 2026-03-31T20:24:14.665 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache4' is now (or already was) a tier of 'datapool' 2026-03-31T20:24:14.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:530: test_tiering_8: ceph osd tier cache-mode cache4 writeback 2026-03-31T20:24:14.882 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'cache4' to writeback 2026-03-31T20:24:14.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:531: test_tiering_8: mktemp 2026-03-31T20:24:14.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:531: test_tiering_8: grep tmp 2026-03-31T20:24:14.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:531: test_tiering_8: tmpfile=/tmp/tmp.QPb3gW1ba6 2026-03-31T20:24:14.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:532: test_tiering_8: dd if=/dev/zero of=/tmp/tmp.QPb3gW1ba6 bs=4K count=1 2026-03-31T20:24:14.896 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-31T20:24:14.896 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-31T20:24:14.896 INFO:tasks.workunit.client.0.vm03.stderr:4096 bytes (4.1 kB, 4.0 KiB) copied, 8.7404e-05 s, 46.9 MB/s 2026-03-31T20:24:14.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:533: test_tiering_8: ceph osd pool set cache4 target_max_objects 200 2026-03-31T20:24:16.619 INFO:tasks.workunit.client.0.vm03.stderr:set pool 20 target_max_objects to 200 2026-03-31T20:24:16.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:534: test_tiering_8: ceph osd pool set cache4 target_max_bytes 1000000 2026-03-31T20:24:18.637 INFO:tasks.workunit.client.0.vm03.stderr:set pool 20 target_max_bytes to 1000000 2026-03-31T20:24:18.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:535: test_tiering_8: rados -p cache4 put foo1 /tmp/tmp.QPb3gW1ba6 2026-03-31T20:24:18.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:536: test_tiering_8: rados -p cache4 put foo2 /tmp/tmp.QPb3gW1ba6 2026-03-31T20:24:18.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:537: test_tiering_8: rm -f /tmp/tmp.QPb3gW1ba6 2026-03-31T20:24:18.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:538: test_tiering_8: flush_pg_stats 2026-03-31T20:24:18.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: 
local timeout=300 2026-03-31T20:24:18.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls 2026-03-31T20:24:18.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0 2026-03-31T20:24:18.906 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:24:18.906 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:24:18.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs= 2026-03-31T20:24:18.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:24:18.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats 2026-03-31T20:24:18.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738410 2026-03-31T20:24:18.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738410 2026-03-31T20:24:18.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-34359738410' 2026-03-31T20:24:18.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:24:18.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats 2026-03-31T20:24:19.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738410 2026-03-31T20:24:19.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738410 2026-03-31T20:24:19.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-34359738410 1-34359738410' 2026-03-31T20:24:19.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:24:19.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats 2026-03-31T20:24:19.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738410 2026-03-31T20:24:19.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test 
-z 34359738410 2026-03-31T20:24:19.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-34359738410 1-34359738410 2-34359738410' 2026-03-31T20:24:19.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:24:19.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-34359738410 2026-03-31T20:24:19.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:24:19.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0 2026-03-31T20:24:19.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-34359738410 2026-03-31T20:24:19.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:24:19.156 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 34359738410 2026-03-31T20:24:19.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738410 2026-03-31T20:24:19.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 34359738410' 2026-03-31T20:24:19.156 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:24:19.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738408 -lt 34359738410 2026-03-31T20:24:19.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:24:20.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-31T20:24:20.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:24:20.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738408 -lt 34359738410 2026-03-31T20:24:20.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:24:21.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-31T20:24:21.562 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:24:21.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738410 -lt 34359738410 2026-03-31T20:24:21.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:24:21.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738410 2026-03-31T20:24:21.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:24:21.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1 2026-03-31T20:24:21.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738410 2026-03-31T20:24:21.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:24:21.770 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738410 2026-03-31T20:24:21.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738410 2026-03-31T20:24:21.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738410' 2026-03-31T20:24:21.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:24:21.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738410 -lt 34359738410 2026-03-31T20:24:21.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:24:21.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738410 2026-03-31T20:24:21.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:24:21.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2 2026-03-31T20:24:21.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738410 2026-03-31T20:24:21.974 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:24:21.975 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738410 2026-03-31T20:24:21.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738410 2026-03-31T20:24:21.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738410' 2026-03-31T20:24:21.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-31T20:24:22.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738410 -lt 34359738410 2026-03-31T20:24:22.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:539: test_tiering_8: ceph df 2026-03-31T20:24:22.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:539: test_tiering_8: grep datapool 2026-03-31T20:24:22.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:539: test_tiering_8: grep ' 2 ' 2026-03-31T20:24:22.457 INFO:tasks.workunit.client.0.vm03.stdout:datapool 19 2 8 KiB 2 16 KiB 0 128 GiB 2026-03-31T20:24:22.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:540: test_tiering_8: ceph osd tier remove-overlay datapool 2026-03-31T20:24:22.707 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'datapool' 2026-03-31T20:24:22.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:541: test_tiering_8: ceph osd tier remove datapool cache4 2026-03-31T20:24:23.716 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache4' is now (or already was) not a tier of 'datapool' 2026-03-31T20:24:23.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:542: test_tiering_8: ceph osd pool delete cache4 cache4 --yes-i-really-really-mean-it 2026-03-31T20:24:24.719 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache4' does not exist 2026-03-31T20:24:24.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:543: test_tiering_8: ceph osd pool delete datapool datapool --yes-i-really-really-mean-it 2026-03-31T20:24:25.727 INFO:tasks.workunit.client.0.vm03.stderr:pool 'datapool' does not exist 2026-03-31T20:24:25.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:544: test_tiering_8: ceph osd unset notieragent 2026-03-31T20:24:27.683 INFO:tasks.workunit.client.0.vm03.stderr:notieragent is unset 2026-03-31T20:24:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:24:27.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_9 2026-03-31T20:24:27.900 
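test_tiering_8's teardown, traced just above before test_tiering_9 starts, follows the required order for dismantling a cache tier: drop the overlay first, detach the tier from its base, delete the pools, and only then re-enable the tier agent. All five commands appear verbatim in the trace:

    ceph osd tier remove-overlay datapool
    ceph osd tier remove datapool cache4
    ceph osd pool delete cache4 cache4 --yes-i-really-really-mean-it
    ceph osd pool delete datapool datapool --yes-i-really-really-mean-it
    ceph osd unset notieragent

The "is now (or already was)" and "does not exist" replies suggest the mon treats repeats of these commands as no-ops, which is what lets the teardown rerun safely.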
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:553: test_tiering_9: ceph osd pool create basepoolA 2 2026-03-31T20:24:28.748 INFO:tasks.workunit.client.0.vm03.stderr:pool 'basepoolA' already exists 2026-03-31T20:24:28.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:554: test_tiering_9: ceph osd pool application enable basepoolA rados 2026-03-31T20:24:30.699 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'basepoolA' 2026-03-31T20:24:30.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:555: test_tiering_9: ceph osd pool create basepoolB 2 2026-03-31T20:24:31.763 INFO:tasks.workunit.client.0.vm03.stderr:pool 'basepoolB' already exists 2026-03-31T20:24:31.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:556: test_tiering_9: ceph osd pool application enable basepoolB rados 2026-03-31T20:24:33.724 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'basepoolB' 2026-03-31T20:24:33.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:557: test_tiering_9: ceph osd dump 2026-03-31T20:24:33.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:557: test_tiering_9: grep 'pool.*basepoolA' 2026-03-31T20:24:33.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:557: test_tiering_9: awk '{print $2;}' 2026-03-31T20:24:33.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:557: test_tiering_9: poolA_id=21 2026-03-31T20:24:33.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:558: test_tiering_9: ceph osd dump 2026-03-31T20:24:33.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:558: test_tiering_9: grep 'pool.*basepoolB' 2026-03-31T20:24:33.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:558: test_tiering_9: awk '{print $2;}' 2026-03-31T20:24:34.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:558: test_tiering_9: poolB_id=22 2026-03-31T20:24:34.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:560: test_tiering_9: ceph osd pool create cache5 2 2026-03-31T20:24:34.790 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache5' already exists 2026-03-31T20:24:34.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:561: test_tiering_9: ceph osd pool create cache6 2 2026-03-31T20:24:35.797 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache6' already exists 2026-03-31T20:24:35.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:562: test_tiering_9: ceph osd tier add basepoolA cache5 2026-03-31T20:24:36.806 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache5' is now (or already was) a tier of 'basepoolA' 2026-03-31T20:24:36.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:563: 
test_tiering_9: ceph osd tier add basepoolB cache6 2026-03-31T20:24:37.812 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache6' is now (or already was) a tier of 'basepoolB' 2026-03-31T20:24:37.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:564: test_tiering_9: ceph osd tier remove basepoolB cache5 2026-03-31T20:24:37.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:564: test_tiering_9: grep 'not a tier of' 2026-03-31T20:24:38.043 INFO:tasks.workunit.client.0.vm03.stdout:pool 'cache5' is now (or already was) not a tier of 'basepoolB' 2026-03-31T20:24:38.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:565: test_tiering_9: ceph osd dump 2026-03-31T20:24:38.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:565: test_tiering_9: grep 'pool.*'\''cache5'\''' 2026-03-31T20:24:38.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:565: test_tiering_9: grep 'tier_of[ \t]\+21' 2026-03-31T20:24:38.249 INFO:tasks.workunit.client.0.vm03.stdout:pool 23 'cache5' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 2 pgp_num 2 autoscale_mode off last_change 134 flags hashpspool tier_of 21 stripe_width 0 application rados read_balance_score 1.49 2026-03-31T20:24:38.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:566: test_tiering_9: ceph osd tier remove basepoolA cache6 2026-03-31T20:24:38.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:566: test_tiering_9: grep 'not a tier of' 2026-03-31T20:24:38.466 INFO:tasks.workunit.client.0.vm03.stdout:pool 'cache6' is now (or already was) not a tier of 'basepoolA' 2026-03-31T20:24:38.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:567: test_tiering_9: ceph osd dump 2026-03-31T20:24:38.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:567: test_tiering_9: grep 'pool.*'\''cache6'\''' 2026-03-31T20:24:38.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:567: test_tiering_9: grep 'tier_of[ \t]\+22' 2026-03-31T20:24:38.677 INFO:tasks.workunit.client.0.vm03.stdout:pool 24 'cache6' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 2 pgp_num 2 autoscale_mode off last_change 135 flags hashpspool tier_of 22 stripe_width 0 application rados read_balance_score 2.99 2026-03-31T20:24:38.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:569: test_tiering_9: ceph osd tier remove basepoolA cache5 2026-03-31T20:24:38.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:569: test_tiering_9: grep 'not a tier of' 2026-03-31T20:24:39.844 INFO:tasks.workunit.client.0.vm03.stdout:pool 'cache5' is now (or already was) not a tier of 'basepoolA' 2026-03-31T20:24:39.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:570: test_tiering_9: ceph osd dump 2026-03-31T20:24:39.844 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:570: test_tiering_9: grep 'pool.*'\''cache5'\''' 2026-03-31T20:24:39.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:570: test_tiering_9: grep tier_of 2026-03-31T20:24:40.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:571: test_tiering_9: ceph osd tier remove basepoolB cache6 2026-03-31T20:24:40.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:571: test_tiering_9: grep 'not a tier of' 2026-03-31T20:24:40.851 INFO:tasks.workunit.client.0.vm03.stdout:pool 'cache6' is now (or already was) not a tier of 'basepoolB' 2026-03-31T20:24:40.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:572: test_tiering_9: ceph osd dump 2026-03-31T20:24:40.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:572: test_tiering_9: grep 'pool.*'\''cache6'\''' 2026-03-31T20:24:40.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:572: test_tiering_9: grep tier_of 2026-03-31T20:24:41.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:574: test_tiering_9: ceph osd dump 2026-03-31T20:24:41.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:574: test_tiering_9: grep 'pool.*'\''basepoolA'\''' 2026-03-31T20:24:41.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:574: test_tiering_9: grep tiers 2026-03-31T20:24:41.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:575: test_tiering_9: ceph osd dump 2026-03-31T20:24:41.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:575: test_tiering_9: grep 'pool.*'\''basepoolB'\''' 2026-03-31T20:24:41.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:575: test_tiering_9: grep tiers 2026-03-31T20:24:41.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:577: test_tiering_9: ceph osd pool delete cache6 cache6 --yes-i-really-really-mean-it 2026-03-31T20:24:41.840 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache6' does not exist 2026-03-31T20:24:41.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:578: test_tiering_9: ceph osd pool delete cache5 cache5 --yes-i-really-really-mean-it 2026-03-31T20:24:42.849 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cache5' does not exist 2026-03-31T20:24:42.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:579: test_tiering_9: ceph osd pool delete basepoolB basepoolB --yes-i-really-really-mean-it 2026-03-31T20:24:43.851 INFO:tasks.workunit.client.0.vm03.stderr:pool 'basepoolB' does not exist 2026-03-31T20:24:43.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:580: test_tiering_9: ceph osd pool delete basepoolA basepoolA --yes-i-really-really-mean-it 2026-03-31T20:24:44.866 
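The point of test_tiering_9, visible in the greps above: `ceph osd tier remove` against the wrong base pool is a harmless no-op. The cache stays bound to its real base (osd dump still shows `tier_of 21` for cache5 and `tier_of 22` for cache6), and only a remove against the correct base clears `tier_of`. Condensed below, with the pool id scraped from osd dump the way lines :557-:558 do it; the `2>&1` is an assumption, since redirections never appear in an xtrace, but the status line is printed on stderr when unpiped:

    # Pool id is the second field of the 'pool <id> ...' line in osd dump.
    poolA_id=$(ceph osd dump | grep 'pool.*basepoolA' | awk '{print $2;}')
    # Wrong base: reports a no-op and tier_of is untouched.
    ceph osd tier remove basepoolB cache5 2>&1 | grep 'not a tier of'
    ceph osd dump | grep "pool.*'cache5'" | grep "tier_of[ \t]\+$poolA_id"
    # Correct base: tier_of disappears from the pool line.
    ceph osd tier remove basepoolA cache5 2>&1 | grep 'not a tier of'
    ! ceph osd dump | grep "pool.*'cache5'" | grep tier_of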
INFO:tasks.workunit.client.0.vm03.stderr:pool 'basepoolA' does not exist 2026-03-31T20:24:44.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:24:45.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_auth 2026-03-31T20:24:45.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:585: test_auth: expect_false ceph auth add client.xx mon invalid osd 'allow *' 2026-03-31T20:24:45.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:24:45.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth add client.xx mon invalid osd 'allow *' 2026-03-31T20:24:45.263 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: mon capability parse failed, stopped at 'invalid' of 'invalid' 2026-03-31T20:24:45.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:24:45.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:586: test_auth: expect_false ceph auth add client.xx mon 'allow *' osd 'allow *' invalid 'allow *' 2026-03-31T20:24:45.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:24:45.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth add client.xx mon 'allow *' osd 'allow *' invalid 'allow *' 2026-03-31T20:24:45.443 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unknown cap type 'invalid' 2026-03-31T20:24:45.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:24:45.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:587: test_auth: ceph auth add client.xx mon 'allow *' osd 'allow *' 2026-03-31T20:24:45.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:588: test_auth: ceph auth export client.xx 2026-03-31T20:24:45.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:589: test_auth: ceph auth add client.xx -i client.xx.keyring 2026-03-31T20:24:46.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:590: test_auth: rm -f client.xx.keyring 2026-03-31T20:24:46.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:591: test_auth: ceph auth list 2026-03-31T20:24:46.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:591: test_auth: grep client.xx 2026-03-31T20:24:46.501 INFO:tasks.workunit.client.0.vm03.stdout:client.xx 2026-03-31T20:24:46.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:592: test_auth: ceph auth ls 2026-03-31T20:24:46.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:592: 
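expect_false, which test_auth leans on from here down, simply inverts the wrapped command's exit status; its body is fully visible in the traced lines :35-:36. Reconstructed:

    # Reconstruction of test.sh's expect_false from its own xtrace.
    expect_false() {
        set -x
        if "$@"; then return 1; else return 0; fi
    }

Both malformed cap specs above are rejected before any entity is created: a bad mon cap string fails to parse (EINVAL, "mon capability parse failed") and an unknown daemon type is refused outright (EINVAL, "unknown cap type 'invalid'").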
test_auth: grep client.xx 2026-03-31T20:24:46.767 INFO:tasks.workunit.client.0.vm03.stdout:client.xx 2026-03-31T20:24:46.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:593: test_auth: ceph auth get client.xx 2026-03-31T20:24:46.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:593: test_auth: grep caps 2026-03-31T20:24:46.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:593: test_auth: grep mon 2026-03-31T20:24:47.035 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow *" 2026-03-31T20:24:47.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:594: test_auth: ceph auth get client.xx 2026-03-31T20:24:47.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:594: test_auth: grep caps 2026-03-31T20:24:47.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:594: test_auth: grep osd 2026-03-31T20:24:47.294 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow *" 2026-03-31T20:24:47.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:595: test_auth: ceph auth get-key client.xx 2026-03-31T20:24:47.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:596: test_auth: ceph auth print-key client.xx 2026-03-31T20:24:47.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:597: test_auth: ceph auth print_key client.xx 2026-03-31T20:24:48.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:598: test_auth: ceph auth caps client.xx osd 'allow rw' 2026-03-31T20:24:48.346 INFO:tasks.workunit.client.0.vm03.stdout:AQCNLcxp7d/qJBAAUE+XSEGK1PVBta1vz1SpbQ==AQCNLcxp7d/qJBAAUE+XSEGK1PVBta1vz1SpbQ==AQCNLcxp7d/qJBAAUE+XSEGK1PVBta1vz1SpbQ==[client.xx] 2026-03-31T20:24:48.346 INFO:tasks.workunit.client.0.vm03.stdout: key = AQCNLcxp7d/qJBAAUE+XSEGK1PVBta1vz1SpbQ== 2026-03-31T20:24:48.346 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow rw" 2026-03-31T20:24:48.346 INFO:tasks.workunit.client.0.vm03.stderr:updated caps for client.xx 2026-03-31T20:24:48.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:599: test_auth: expect_false sh 2026-03-31T20:24:48.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:24:48.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: sh 2026-03-31T20:24:48.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:24:48.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:600: test_auth: ceph auth get client.xx 2026-03-31T20:24:48.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:600: test_auth: grep osd 2026-03-31T20:24:48.625 
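The run-together stdout above (three base64 keys back to back, then `[client.xx]`) is not corruption: get-key, print-key, and the older print_key spelling each emit the bare secret with no trailing newline, so the three invocations at :595-:597 land adjacent in the captured stream, followed immediately by the keyring that `auth caps` echoes back. The newline-free output is deliberate, since the usual consumer is a command substitution:

    # get-key prints just the secret, ready for direct capture.
    key=$(ceph auth get-key client.xx)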
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:600: test_auth: grep 'allow rw' 2026-03-31T20:24:48.901 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow rw" 2026-03-31T20:24:48.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:601: test_auth: ceph auth caps client.xx mon 'allow command "osd tree"' 2026-03-31T20:24:49.164 INFO:tasks.workunit.client.0.vm03.stdout:[client.xx] 2026-03-31T20:24:49.164 INFO:tasks.workunit.client.0.vm03.stdout: key = AQCNLcxp7d/qJBAAUE+XSEGK1PVBta1vz1SpbQ== 2026-03-31T20:24:49.164 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow command \"osd tree\"" 2026-03-31T20:24:49.164 INFO:tasks.workunit.client.0.vm03.stderr:updated caps for client.xx 2026-03-31T20:24:49.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:602: test_auth: ceph auth export 2026-03-31T20:24:49.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:602: test_auth: grep client.xx 2026-03-31T20:24:49.438 INFO:tasks.workunit.client.0.vm03.stdout:[client.xx] 2026-03-31T20:24:49.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:603: test_auth: ceph auth export -o authfile 2026-03-31T20:24:49.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:604: test_auth: ceph auth import -i authfile 2026-03-31T20:24:49.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:606: test_auth: ceph auth export -o authfile2 2026-03-31T20:24:50.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:607: test_auth: diff authfile authfile2 2026-03-31T20:24:50.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:608: test_auth: rm authfile authfile2 2026-03-31T20:24:50.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:609: test_auth: ceph auth del client.xx 2026-03-31T20:24:50.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:610: test_auth: expect_false ceph auth get client.xx 2026-03-31T20:24:50.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:24:50.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth get client.xx 2026-03-31T20:24:50.675 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: failed to find client.xx in keyring 2026-03-31T20:24:50.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:24:50.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:613: test_auth: ceph auth get-or-create client.admin2 mon 'allow *' 2026-03-31T20:24:50.934 INFO:tasks.workunit.client.0.vm03.stdout:[client.admin2] 2026-03-31T20:24:50.934 INFO:tasks.workunit.client.0.vm03.stdout: key = AQCSLcxpk0HFMhAAzoStLy3g7c0AFuMq1qlwcw== 2026-03-31T20:24:50.945 
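Note what the keyring echoed back above contains after the second `auth caps` call: only the mon cap. `ceph auth caps` replaces the entity's entire cap set rather than merging, so any cap you want to keep must be restated alongside the new one, e.g. (hypothetical combined form, not run in this test):

    ceph auth caps client.xx mon 'allow command "osd tree"' osd 'allow rw'

The export/import round trip that follows (:603-:607) then checks that re-importing a full `auth export` dump is a no-op: a second export must diff clean against the first.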
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:614: test_auth: ceph auth get client.admin2 2026-03-31T20:24:51.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:615: test_auth: env CEPH_KEYRING=keyring1 ceph -n client.admin2 auth get client.admin2 2026-03-31T20:24:51.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:617: test_auth: expect_true diff -au keyring1 keyring2 2026-03-31T20:24:51.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:24:51.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: diff -au keyring1 keyring2 2026-03-31T20:24:51.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:24:51.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:619: test_auth: env CEPH_KEYRING=keyring1 ceph -n client.admin2 auth rotate client.admin2 2026-03-31T20:24:51.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:621: test_auth: diff -au keyring1 keyring3 2026-03-31T20:24:51.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:621: test_auth: grep -E '^[-+][^-+]' 2026-03-31T20:24:51.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:621: test_auth: expect_false grep -v key 2026-03-31T20:24:51.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:24:51.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep -v key 2026-03-31T20:24:51.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:24:51.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:623: test_auth: expect_false env CEPH_KEYRING=keyring1 ceph -n client.admin2 auth get client.admin2 2026-03-31T20:24:51.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:24:51.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: env CEPH_KEYRING=keyring1 ceph -n client.admin2 auth get client.admin2 2026-03-31T20:24:51.798 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:24:51.792+0000 7f9fea1a9640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [2] 2026-03-31T20:24:51.799 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:24:51.792+0000 7f9fea9aa640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [2] 2026-03-31T20:24:51.799 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:24:51.792+0000 7f9fe99a8640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [2] 2026-03-31T20:24:51.799 
INFO:tasks.workunit.client.0.vm03.stderr:[errno 13] RADOS permission denied (error connecting to the cluster) 2026-03-31T20:24:51.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:24:51.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:625: test_auth: expect_true env CEPH_KEYRING=keyring3 ceph -n client.admin2 auth get client.admin2 2026-03-31T20:24:51.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:24:51.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: env CEPH_KEYRING=keyring3 ceph -n client.admin2 auth get client.admin2 2026-03-31T20:24:52.058 INFO:tasks.workunit.client.0.vm03.stdout:[client.admin2] 2026-03-31T20:24:52.059 INFO:tasks.workunit.client.0.vm03.stdout: key = AQCTLcxpUzhNKxAAIGb8Uwg3gzp7fz0rjm4j4g== 2026-03-31T20:24:52.059 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow *" 2026-03-31T20:24:52.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:24:52.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:627: test_auth: expect_true ceph auth get client.admin2 2026-03-31T20:24:52.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:24:52.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: ceph auth get client.admin2 2026-03-31T20:24:52.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:24:52.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:628: test_auth: expect_true diff -au keyring3 keyring4 2026-03-31T20:24:52.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:24:52.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: diff -au keyring3 keyring4 2026-03-31T20:24:52.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:24:52.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:629: test_auth: expect_true ceph auth rm client.admin2 2026-03-31T20:24:52.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:24:52.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: ceph auth rm client.admin2 2026-03-31T20:24:52.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:24:52.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:630: test_auth: rm keyring1 keyring2 keyring3 keyring4 
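The rotation sequence above swaps client.admin2's secret in place and checks three things: the keyring diff contains nothing but changed `key = ...` lines, the pre-rotation keyring can no longer authenticate (the handle_auth_bad_method hunting attempts ending in [errno 13]), and the rotated keyring works. In outline, reusing expect_false from the sketch above; the redirections into keyring1/keyring3 are assumptions, since redirections never show up in an xtrace:

    ceph auth get client.admin2 > keyring1                        # assumed redirect
    env CEPH_KEYRING=keyring1 ceph -n client.admin2 \
        auth rotate client.admin2 > keyring3                      # assumed redirect
    # Any diff hunk line that is not a 'key = ...' line fails the check.
    diff -au keyring1 keyring3 | grep -E '^[-+][^-+]' | expect_false grep -v key
    # Stale key is refused; rotated key authenticates.
    expect_false env CEPH_KEYRING=keyring1 ceph -n client.admin2 auth get client.admin2
    env CEPH_KEYRING=keyring3 ceph -n client.admin2 auth get client.admin2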
2026-03-31T20:24:52.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:633: test_auth: echo -e 'auth add client.xx mon "allow *" osd "allow *"\n' 2026-03-31T20:24:52.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:633: test_auth: ceph 2026-03-31T20:24:52.785 INFO:tasks.workunit.client.0.vm03.stderr:added key for client.xx 2026-03-31T20:24:52.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:634: test_auth: ceph auth get client.xx 2026-03-31T20:24:53.060 INFO:tasks.workunit.client.0.vm03.stdout:[client.xx] 2026-03-31T20:24:53.060 INFO:tasks.workunit.client.0.vm03.stdout: key = AQCULcxp0a+CLhAAUVR7+mX2yKVswCjLZN2hUQ== 2026-03-31T20:24:53.060 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow *" 2026-03-31T20:24:53.060 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow *" 2026-03-31T20:24:53.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:636: test_auth: echo 'auth del client.xx' 2026-03-31T20:24:53.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:636: test_auth: ceph 2026-03-31T20:24:53.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:637: test_auth: expect_false ceph auth get client.xx 2026-03-31T20:24:53.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:24:53.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth get client.xx 2026-03-31T20:24:53.440 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: failed to find client.xx in keyring 2026-03-31T20:24:53.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:24:53.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:24:53.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_auth_profiles 2026-03-31T20:24:53.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:642: test_auth_profiles: ceph auth add client.xx-profile-ro mon 'allow profile read-only' mgr 'allow profile read-only' 2026-03-31T20:24:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:644: test_auth_profiles: ceph auth add client.xx-profile-rw mon 'allow profile read-write' mgr 'allow profile read-write' 2026-03-31T20:24:54.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:646: test_auth_profiles: ceph auth add client.xx-profile-rd mon 'allow profile role-definer' 2026-03-31T20:24:54.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:648: test_auth_profiles: ceph auth export 2026-03-31T20:24:54.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:651: test_auth_profiles: ceph -n client.xx-profile-ro -k client.xx.keyring status 2026-03-31T20:24:54.981 
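The `echo ... | ceph` pipelines traced above (:633, :636) exercise another input path: invoked with no subcommand, the ceph tool reads commands from stdin, one per line, so whole command batches can be piped in:

    echo -e 'auth add client.xx mon "allow *" osd "allow *"\n' | ceph
    echo 'auth del client.xx' | ceph

Both lines are verbatim from the trace; "added key for client.xx" and the subsequent ENOENT on `auth get client.xx` confirm the piped commands really executed. The profile tests that follow switch identities with `-n client.xx-profile-* -k client.xx.keyring`.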
INFO:tasks.workunit.client.0.vm03.stdout: cluster: 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: id: a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: health: HEALTH_OK 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: services: 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: mon: 3 daemons, quorum a,b,c (age 3m) [leader: a] 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: mgr: x(active, since 3m) 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: osd: 3 osds: 3 up (since 3m), 3 in (since 3m) 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: data: 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: pools: 2 pools, 9 pgs 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: objects: 4 objects, 449 KiB 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: usage: 83 MiB used, 270 GiB / 270 GiB avail 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: pgs: 9 active+clean 2026-03-31T20:24:54.981 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:54.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:652: test_auth_profiles: ceph -n client.xx-profile-ro -k client.xx.keyring osd dump 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:epoch 141 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:fsid a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:created 2026-03-31T20:21:23.960433+0000 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:modified 2026-03-31T20:24:44.806162+0000 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:flags sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:crush_version 5 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:full_ratio 0.95 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:backfillfull_ratio 0.9 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:nearfull_ratio 0.85 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:require_min_compat_client luminous 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:min_compat_client jewel 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:require_osd_release tentacle 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:stretch_mode_enabled false 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:pool 1 '.mgr' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 1 pgp_num 1 autoscale_mode off last_change 11 flags hashpspool stripe_width 0 pg_num_max 32 pg_num_min 1 application mgr read_balance_score 2.99 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:pool 2 'rbd' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 8 pgp_num 8 autoscale_mode off last_change 15 flags hashpspool,selfmanaged_snaps stripe_width 0 application rbd read_balance_score 1.88 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:max_osd 3 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 8 up_thru 132 down_at 0 last_clean_interval [0,0) 
v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6805/950776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 8 up_thru 133 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6800/3578452477 v2:192.168.123.103:6801/3578452477 exists,up ea32c3bc-b054-4d75-ba06-731b384d7a6a 2026-03-31T20:24:55.187 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 8 up_thru 132 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6808/1523795840 v2:192.168.123.103:6809/1523795840 exists,up b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8 2026-03-31T20:24:55.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:653: test_auth_profiles: ceph -n client.xx-profile-ro -k client.xx.keyring pg dump 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:version 241 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:stamp 2026-03-31T20:24:54.547419+0000 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:last_osdmap_epoch 0 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:last_pg_scan 0 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:1.0 2 0 0 0 0 459280 0 0 32 0 32 active+clean 2026-03-31T20:21:30.016540+0000 10'32 141:318 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:28.998269+0000 0'0 2026-03-31T20:21:28.998269+0000 0 0 periodic scrub scheduled @ 2026-04-02T05:07:29.140176+0000 0 0 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:2.3 1 0 0 0 0 0 0 0 1 0 1 active+clean 2026-03-31T20:21:35.038258+0000 13'1 141:274 [1,2] 1 [1,2] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T23:34:37.955562+0000 0 0 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.041462+0000 0'0 141:273 [2,1] 2 [2,1] 2 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-02T01:46:03.277652+0000 0 0 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:2.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.041386+0000 0'0 141:273 [2,1] 2 [2,1] 2 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T22:00:11.952577+0000 0 0 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 0 19 0 0 2 0 2 active+clean 2026-03-31T20:21:35.037546+0000 15'2 141:276 [0,1] 0 [0,1] 0 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T21:10:55.509790+0000 0 0 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.038061+0000 0'0 141:273 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-02T00:01:36.825157+0000 0 0 2026-03-31T20:24:55.389 INFO:tasks.workunit.client.0.vm03.stdout:2.5 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.038056+0000 
0'0 141:273 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-02T07:12:07.333081+0000 0 0 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:2.6 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.037923+0000 0'0 141:273 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T23:12:47.404171+0000 0 0 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:2.7 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.038270+0000 0'0 141:273 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T21:37:49.889387+0000 0 0 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:2 2 0 0 0 0 19 0 0 3 3 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:1 2 0 0 0 0 459280 0 0 32 32 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:sum 4 0 0 0 0 459299 0 0 35 35 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:OSD_STAT USED AVAIL USED_RAW TOTAL HB_PEERS PG_SUM PRIMARY_PG_SUM 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:2 27 MiB 90 GiB 27 MiB 90 GiB [0,1] 3 2 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:1 28 MiB 90 GiB 28 MiB 90 GiB [0,2] 9 6 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:0 28 MiB 90 GiB 28 MiB 90 GiB [1,2] 6 1 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:sum 83 MiB 270 GiB 83 MiB 270 GiB 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
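The profile-ro `pg dump` above returns the full plain-text report: the per-PG table, per-pool and cluster-wide sums, and the OSD_STAT utilization table. For scripting against the same data, the stock ceph CLI formatter flag avoids parsing this layout (an aside — this run never exercises it):

    ceph -n client.xx-profile-ro -k client.xx.keyring pg dump --format json-pretty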
2026-03-31T20:24:55.390 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-31T20:24:55.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:654: test_auth_profiles: ceph -n client.xx-profile-ro -k client.xx.keyring mon dump 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stdout:epoch 1 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stdout:fsid a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stdout:last_changed 2026-03-31T20:21:18.374590+0000 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stdout:created 2026-03-31T20:21:18.374590+0000 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stdout:min_mon_release 20 (tentacle) 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stdout:election_strategy: 3 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stdout:0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.a 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stdout:1: [v2:192.168.123.103:3301/0,v1:192.168.123.103:6790/0] mon.b 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stdout:2: [v2:192.168.123.103:3302/0,v1:192.168.123.103:6791/0] mon.c 2026-03-31T20:24:55.666 INFO:tasks.workunit.client.0.vm03.stderr:dumped monmap epoch 1 2026-03-31T20:24:55.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:656: test_auth_profiles: ceph -n client.xx-profile-ro -k client.xx.keyring log foo 2026-03-31T20:24:55.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:656: test_auth_profiles: true 2026-03-31T20:24:55.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:657: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:24:55.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EACCES: access denied' 2026-03-31T20:24:55.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:24:55.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:24:55.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:24:55.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:24:55.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:658: test_auth_profiles: ceph -n client.xx-profile-ro -k client.xx.keyring osd set noout 2026-03-31T20:24:56.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:658: test_auth_profiles: true 2026-03-31T20:24:56.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:659: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:24:56.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: 
check_response: expected_string='EACCES: access denied' 2026-03-31T20:24:56.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:24:56.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:24:56.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:24:56.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:24:56.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:660: test_auth_profiles: ceph -n client.xx-profile-ro -k client.xx.keyring auth ls 2026-03-31T20:24:56.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:660: test_auth_profiles: true 2026-03-31T20:24:56.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:661: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:24:56.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EACCES: access denied' 2026-03-31T20:24:56.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:24:56.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:24:56.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:24:56.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:24:56.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:664: test_auth_profiles: ceph -n client.xx-profile-rw -k client.xx.keyring status 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: cluster: 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: id: a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: health: HEALTH_OK 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: services: 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: mon: 3 daemons, quorum a,b,c (age 3m) [leader: a] 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: mgr: x(active, since 3m) 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: osd: 3 osds: 3 up (since 3m), 3 in (since 3m) 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: data: 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: pools: 2 pools, 9 pgs 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: objects: 4 objects, 449 KiB 
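check_response, traced at :103-:111 each time, is the other half of the denial checks: the command under test has its stderr captured to a temp file (the /tmp/cephtool.sYl/test_invalid.NL9 seen in the greps) and is followed by `|| true` (the bare `true` lines) so the expected failure does not kill the script; check_response then greps the capture for the expected message. A reconstruction with the quoting tightened; $TMPFILE stands in for the test's temp file:

    check_response() {
        expected_string=$1
        retcode=$2
        expected_retcode=$3
        if [ -n "$expected_retcode" ] && [ "$retcode" != "$expected_retcode" ]; then
            echo "expected retcode $expected_retcode, got $retcode"
            return 1
        fi
        if ! grep --quiet -- "$expected_string" "$TMPFILE"; then
            echo "'$expected_string' not found in output"
            return 1
        fi
    }
    # Usage, as in the profile-ro denials above (the redirect is an assumption):
    ceph -n client.xx-profile-ro -k client.xx.keyring osd set noout 2> "$TMPFILE" || true
    check_response 'EACCES: access denied'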
2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: usage: 83 MiB used, 270 GiB / 270 GiB avail 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: pgs: 9 active+clean 2026-03-31T20:24:56.459 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:56.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:665: test_auth_profiles: ceph -n client.xx-profile-rw -k client.xx.keyring osd dump 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:epoch 141 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:fsid a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:created 2026-03-31T20:21:23.960433+0000 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:modified 2026-03-31T20:24:44.806162+0000 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:flags sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:crush_version 5 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:full_ratio 0.95 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:backfillfull_ratio 0.9 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:nearfull_ratio 0.85 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:require_min_compat_client luminous 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:min_compat_client jewel 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:require_osd_release tentacle 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:stretch_mode_enabled false 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:pool 1 '.mgr' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 1 pgp_num 1 autoscale_mode off last_change 11 flags hashpspool stripe_width 0 pg_num_max 32 pg_num_min 1 application mgr read_balance_score 2.99 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:pool 2 'rbd' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 8 pgp_num 8 autoscale_mode off last_change 15 flags hashpspool,selfmanaged_snaps stripe_width 0 application rbd read_balance_score 1.88 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:max_osd 3 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 8 up_thru 132 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6805/950776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 8 up_thru 133 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6800/3578452477 v2:192.168.123.103:6801/3578452477 exists,up ea32c3bc-b054-4d75-ba06-731b384d7a6a 2026-03-31T20:24:56.667 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 8 up_thru 132 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6808/1523795840 v2:192.168.123.103:6809/1523795840 exists,up b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8 2026-03-31T20:24:56.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:666: test_auth_profiles: ceph -n client.xx-profile-rw -k client.xx.keyring pg dump 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:version 242 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:stamp 2026-03-31T20:24:56.547629+0000 
2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:last_osdmap_epoch 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:last_pg_scan 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:1.0 2 0 0 0 0 459280 0 0 32 0 32 active+clean 2026-03-31T20:21:30.016540+0000 10'32 141:318 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:28.998269+0000 0'0 2026-03-31T20:21:28.998269+0000 0 0 periodic scrub scheduled @ 2026-04-02T05:07:29.140176+0000 0 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2.3 1 0 0 0 0 0 0 0 1 0 1 active+clean 2026-03-31T20:21:35.038258+0000 13'1 141:274 [1,2] 1 [1,2] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T23:34:37.955562+0000 0 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.041462+0000 0'0 141:273 [2,1] 2 [2,1] 2 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-02T01:46:03.277652+0000 0 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.041386+0000 0'0 141:273 [2,1] 2 [2,1] 2 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T22:00:11.952577+0000 0 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 0 19 0 0 2 0 2 active+clean 2026-03-31T20:21:35.037546+0000 15'2 141:276 [0,1] 0 [0,1] 0 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T21:10:55.509790+0000 0 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.038061+0000 0'0 141:273 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-02T00:01:36.825157+0000 0 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2.5 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.038056+0000 0'0 141:273 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-02T07:12:07.333081+0000 0 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2.6 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.037923+0000 0'0 141:273 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T23:12:47.404171+0000 0 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2.7 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:21:35.038270+0000 0'0 141:273 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:21:32.020592+0000 0'0 2026-03-31T20:21:32.020592+0000 0 0 periodic scrub scheduled @ 2026-04-01T21:37:49.889387+0000 0 0 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2 2 0 0 0 0 19 0 0 3 3 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:1 2 0 0 0 0 459280 0 0 32 32 
2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:sum 4 0 0 0 0 459299 0 0 35 35 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:OSD_STAT USED AVAIL USED_RAW TOTAL HB_PEERS PG_SUM PRIMARY_PG_SUM 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:2 27 MiB 90 GiB 27 MiB 90 GiB [0,1] 3 2 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:1 28 MiB 90 GiB 28 MiB 90 GiB [0,2] 9 6 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:0 28 MiB 90 GiB 28 MiB 90 GiB [1,2] 6 1 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:sum 83 MiB 270 GiB 83 MiB 270 GiB 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-31T20:24:56.872 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-31T20:24:56.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:667: test_auth_profiles: ceph -n client.xx-profile-rw -k client.xx.keyring mon dump 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stdout:epoch 1 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stdout:fsid a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stdout:last_changed 2026-03-31T20:21:18.374590+0000 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stdout:created 2026-03-31T20:21:18.374590+0000 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stdout:min_mon_release 20 (tentacle) 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stdout:election_strategy: 3 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stdout:0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.a 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stdout:1: [v2:192.168.123.103:3301/0,v1:192.168.123.103:6790/0] mon.b 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stdout:2: [v2:192.168.123.103:3302/0,v1:192.168.123.103:6791/0] mon.c 2026-03-31T20:24:57.139 INFO:tasks.workunit.client.0.vm03.stderr:dumped monmap epoch 1 2026-03-31T20:24:57.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:668: test_auth_profiles: ceph -n client.xx-profile-rw -k client.xx.keyring fs dump 2026-03-31T20:24:57.378 INFO:tasks.workunit.client.0.vm03.stdout:e1 2026-03-31T20:24:57.378 INFO:tasks.workunit.client.0.vm03.stdout:btime 2026-03-31T20:21:23:959968+0000 2026-03-31T20:24:57.378 INFO:tasks.workunit.client.0.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-31T20:24:57.378 INFO:tasks.workunit.client.0.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-31T20:24:57.378 INFO:tasks.workunit.client.0.vm03.stdout:legacy client fscid: -1 2026-03-31T20:24:57.378 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:24:57.378 INFO:tasks.workunit.client.0.vm03.stdout:No filesystems configured 2026-03-31T20:24:57.378 
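The trace above walks the read-write profile user through osd dump, pg dump, mon dump, and fs dump, all of which succeed. A minimal sketch of that pattern, assuming a scratch keyring path (the client.xx-* names mirror the test, not a fixed convention):

    # Create a user whose only caps are the mon/mgr read-write profiles,
    # then confirm it can read the cluster maps, as exercised above.
    ceph auth get-or-create client.xx-profile-rw \
        mon 'allow profile read-write' mgr 'allow profile read-write' \
        > client.xx.keyring
    for cmd in 'osd dump' 'pg dump' 'mon dump' 'fs dump'; do
        ceph -n client.xx-profile-rw -k client.xx.keyring $cmd  # unquoted: splits into subcommand words
    done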
INFO:tasks.workunit.client.0.vm03.stderr:dumped fsmap epoch 1 2026-03-31T20:24:57.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:669: test_auth_profiles: ceph -n client.xx-profile-rw -k client.xx.keyring log foo 2026-03-31T20:24:58.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:670: test_auth_profiles: ceph -n client.xx-profile-rw -k client.xx.keyring osd set noout 2026-03-31T20:25:00.887 INFO:tasks.workunit.client.0.vm03.stderr:noout is set 2026-03-31T20:25:00.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:671: test_auth_profiles: ceph -n client.xx-profile-rw -k client.xx.keyring osd unset noout 2026-03-31T20:25:02.895 INFO:tasks.workunit.client.0.vm03.stderr:noout is unset 2026-03-31T20:25:02.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:673: test_auth_profiles: ceph -n client.xx-profile-rw -k client.xx.keyring auth ls 2026-03-31T20:25:03.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:673: test_auth_profiles: true 2026-03-31T20:25:03.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:674: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:25:03.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EACCES: access denied' 2026-03-31T20:25:03.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:25:03.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:25:03.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:25:03.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:03.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:677: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring auth ls 2026-03-31T20:25:03.338 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 2026-03-31T20:25:03.338 INFO:tasks.workunit.client.0.vm03.stdout: key: AQC/LMxppFQZIxAAvxt8xEb3Mmpb8s5fgCIo/g== 2026-03-31T20:25:03.338 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mgr] allow profile osd 2026-03-31T20:25:03.338 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile osd 2026-03-31T20:25:03.338 INFO:tasks.workunit.client.0.vm03.stdout: caps: [osd] allow * 2026-03-31T20:25:03.338 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 2026-03-31T20:25:03.338 INFO:tasks.workunit.client.0.vm03.stdout: key: AQDALMxpcV3jIRAARQlDPsae5Lemc/N64DANTA== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mgr] allow profile osd 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile osd 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [osd] allow * 2026-03-31T20:25:03.339 
INFO:tasks.workunit.client.0.vm03.stdout:osd.2 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQDBLMxpNOpSHxAAkKNTOz94jfGqrCWsY6plTw== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mgr] allow profile osd 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile osd 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [osd] allow * 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.0 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQC+LMxpiW+EMxAAwuZj5IVhvRvsW21mfP6+CA== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mds] allow 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mgr] allow r 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow rw 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [osd] allow rwx 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.admin 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQC+LMxpKFl4IBAAAZ/Y8PyzLJzLfa0wwu/4ZA== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mds] allow * 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mgr] allow * 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow * 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [osd] allow * 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.bootstrap-mds 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQDDLMxpqvlBORAAHc8eu91UoEbKQyyoDXswQA== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile bootstrap-mds 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.bootstrap-mgr 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQDDLMxpKv5BORAAJ9qF6OydIf8mjiNE/mRUVw== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile bootstrap-mgr 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.bootstrap-osd 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQDDLMxp4gFCORAA8dHZcOtuUFZjMBsjXra2wQ== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile bootstrap-osd 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.bootstrap-rbd 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQDDLMxpVAVCORAAyVvwOrVHH0fbz4Fi+oDjYA== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile bootstrap-rbd 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.bootstrap-rbd-mirror 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQDDLMxpRwlCORAArFdDWsgbc7y3P13f2wegMA== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile bootstrap-rbd-mirror 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.bootstrap-rgw 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQDDLMxpPA1CORAAZdxqbGDnCE5Qda8s+Py1Vg== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile bootstrap-rgw 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.xx-profile-rd 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQCWLcxpKuXSFRAAlCX6cEJQmKQYZTNY6dd90w== 
2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile role-definer 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.xx-profile-ro 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQCVLcxpmEsqMRAA7l6iwSsVyQgoQtcLZ1/Xag== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mgr] allow profile read-only 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile read-only 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:client.xx-profile-rw 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQCWLcxppGPFBRAA3dDqsOqT31Bgk7c/1aNftA== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mgr] allow profile read-write 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile read-write 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout:mgr.x 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: key: AQC+LMxpopCpLxAARdBofy8A8Mrzze92vD0opg== 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mds] allow * 2026-03-31T20:25:03.339 INFO:tasks.workunit.client.0.vm03.stdout: caps: [mon] allow profile mgr 2026-03-31T20:25:03.340 INFO:tasks.workunit.client.0.vm03.stdout: caps: [osd] allow * 2026-03-31T20:25:03.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:678: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring auth export 2026-03-31T20:25:03.605 INFO:tasks.workunit.client.0.vm03.stdout:[osd.0] 2026-03-31T20:25:03.605 INFO:tasks.workunit.client.0.vm03.stdout: key = AQC/LMxppFQZIxAAvxt8xEb3Mmpb8s5fgCIo/g== 2026-03-31T20:25:03.605 INFO:tasks.workunit.client.0.vm03.stdout: caps mgr = "allow profile osd" 2026-03-31T20:25:03.605 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile osd" 2026-03-31T20:25:03.605 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow *" 2026-03-31T20:25:03.605 INFO:tasks.workunit.client.0.vm03.stdout:[osd.1] 2026-03-31T20:25:03.605 INFO:tasks.workunit.client.0.vm03.stdout: key = AQDALMxpcV3jIRAARQlDPsae5Lemc/N64DANTA== 2026-03-31T20:25:03.605 INFO:tasks.workunit.client.0.vm03.stdout: caps mgr = "allow profile osd" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile osd" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow *" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[osd.2] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQDBLMxpNOpSHxAAkKNTOz94jfGqrCWsY6plTw== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mgr = "allow profile osd" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile osd" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow *" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.0] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQC+LMxpiW+EMxAAwuZj5IVhvRvsW21mfP6+CA== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mds = "allow" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mgr = "allow r" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow rw" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow rwx" 2026-03-31T20:25:03.606 
INFO:tasks.workunit.client.0.vm03.stdout:[client.admin] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQC+LMxpKFl4IBAAAZ/Y8PyzLJzLfa0wwu/4ZA== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mds = "allow *" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mgr = "allow *" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow *" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow *" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.bootstrap-mds] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQDDLMxpqvlBORAAHc8eu91UoEbKQyyoDXswQA== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile bootstrap-mds" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.bootstrap-mgr] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQDDLMxpKv5BORAAJ9qF6OydIf8mjiNE/mRUVw== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile bootstrap-mgr" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.bootstrap-osd] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQDDLMxp4gFCORAA8dHZcOtuUFZjMBsjXra2wQ== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile bootstrap-osd" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.bootstrap-rbd] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQDDLMxpVAVCORAAyVvwOrVHH0fbz4Fi+oDjYA== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile bootstrap-rbd" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.bootstrap-rbd-mirror] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQDDLMxpRwlCORAArFdDWsgbc7y3P13f2wegMA== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile bootstrap-rbd-mirror" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.bootstrap-rgw] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQDDLMxpPA1CORAAZdxqbGDnCE5Qda8s+Py1Vg== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile bootstrap-rgw" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.xx-profile-rd] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQCWLcxpKuXSFRAAlCX6cEJQmKQYZTNY6dd90w== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile role-definer" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.xx-profile-ro] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQCVLcxpmEsqMRAA7l6iwSsVyQgoQtcLZ1/Xag== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mgr = "allow profile read-only" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile read-only" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout:[client.xx-profile-rw] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQCWLcxppGPFBRAA3dDqsOqT31Bgk7c/1aNftA== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mgr = "allow profile read-write" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile read-write" 2026-03-31T20:25:03.606 
INFO:tasks.workunit.client.0.vm03.stdout:[mgr.x] 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: key = AQC+LMxpopCpLxAARdBofy8A8Mrzze92vD0opg== 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mds = "allow *" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps mon = "allow profile mgr" 2026-03-31T20:25:03.606 INFO:tasks.workunit.client.0.vm03.stdout: caps osd = "allow *" 2026-03-31T20:25:03.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:679: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring auth add client.xx-profile-foo 2026-03-31T20:25:03.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:680: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring status 2026-03-31T20:25:04.139 INFO:tasks.workunit.client.0.vm03.stdout: cluster: 2026-03-31T20:25:04.139 INFO:tasks.workunit.client.0.vm03.stdout: id: a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:25:04.139 INFO:tasks.workunit.client.0.vm03.stdout: health: HEALTH_OK 2026-03-31T20:25:04.139 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:04.139 INFO:tasks.workunit.client.0.vm03.stdout: services: 2026-03-31T20:25:04.139 INFO:tasks.workunit.client.0.vm03.stdout: mon: 3 daemons, quorum a,b,c (age 3m) [leader: a] 2026-03-31T20:25:04.139 INFO:tasks.workunit.client.0.vm03.stdout: mgr: x(active, since 3m) 2026-03-31T20:25:04.139 INFO:tasks.workunit.client.0.vm03.stdout: osd: 3 osds: 3 up (since 3m), 3 in (since 3m) 2026-03-31T20:25:04.140 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:04.140 INFO:tasks.workunit.client.0.vm03.stdout: data: 2026-03-31T20:25:04.140 INFO:tasks.workunit.client.0.vm03.stdout: pools: 2 pools, 9 pgs 2026-03-31T20:25:04.140 INFO:tasks.workunit.client.0.vm03.stdout: objects: 4 objects, 449 KiB 2026-03-31T20:25:04.140 INFO:tasks.workunit.client.0.vm03.stdout: usage: 83 MiB used, 270 GiB / 270 GiB avail 2026-03-31T20:25:04.140 INFO:tasks.workunit.client.0.vm03.stdout: pgs: 9 active+clean 2026-03-31T20:25:04.140 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:04.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:681: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring osd dump 2026-03-31T20:25:04.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:681: test_auth_profiles: true 2026-03-31T20:25:04.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:682: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:25:04.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EACCES: access denied' 2026-03-31T20:25:04.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:25:04.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:25:04.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:25:04.298 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:04.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:683: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring pg dump 2026-03-31T20:25:04.441 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:25:04.436+0000 7f6f57b35640 -1 mgr.server reply reply (13) Permission denied access denied: does your client key have mgr caps? See http://docs.ceph.com/en/latest/mgr/administrator/#client-authentication 2026-03-31T20:25:04.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:683: test_auth_profiles: true 2026-03-31T20:25:04.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:684: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:25:04.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EACCES: access denied' 2026-03-31T20:25:04.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:25:04.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:25:04.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:25:04.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:04.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:686: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring mon dump 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stdout:epoch 1 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stdout:fsid a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stdout:last_changed 2026-03-31T20:21:18.374590+0000 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stdout:created 2026-03-31T20:21:18.374590+0000 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stdout:min_mon_release 20 (tentacle) 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stdout:election_strategy: 3 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stdout:0: [v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.a 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stdout:1: [v2:192.168.123.103:3301/0,v1:192.168.123.103:6790/0] mon.b 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stdout:2: [v2:192.168.123.103:3302/0,v1:192.168.123.103:6791/0] mon.c 2026-03-31T20:25:04.698 INFO:tasks.workunit.client.0.vm03.stderr:dumped monmap epoch 1 2026-03-31T20:25:04.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:688: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring mon add foo 1.1.1.1 2026-03-31T20:25:04.887 
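Every denial in this stretch follows the same expect-failure idiom as check_response: run the command with output captured to a temp file, swallow the non-zero exit with || true, then grep the capture for the expected error string. A condensed sketch (the temp-file handling is assumed, not the test's own paths):

    # Expect EACCES: the role-definer key may manage auth entities but
    # not, for example, set OSD flags or dump cluster maps.
    TMPFILE=$(mktemp)
    ceph -n client.xx-profile-rd -k client.xx.keyring osd set noout \
        > "$TMPFILE" 2>&1 || true
    grep -q 'EACCES: access denied' "$TMPFILE" \
        || { echo "expected denial, but the command was allowed"; exit 1; }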
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:688: test_auth_profiles: true 2026-03-31T20:25:04.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:689: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:25:04.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EACCES: access denied' 2026-03-31T20:25:04.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:25:04.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:25:04.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:25:04.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:04.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:690: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring fs dump 2026-03-31T20:25:05.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:690: test_auth_profiles: true 2026-03-31T20:25:05.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:691: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:25:05.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EACCES: access denied' 2026-03-31T20:25:05.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:25:05.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:25:05.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:25:05.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:05.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:692: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring log foo 2026-03-31T20:25:05.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:692: test_auth_profiles: true 2026-03-31T20:25:05.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:693: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:25:05.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EACCES: access denied' 2026-03-31T20:25:05.240 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:25:05.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:25:05.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:25:05.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:05.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:694: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring osd set noout 2026-03-31T20:25:05.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:694: test_auth_profiles: true 2026-03-31T20:25:05.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:695: test_auth_profiles: check_response 'EACCES: access denied' 2026-03-31T20:25:05.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='EACCES: access denied' 2026-03-31T20:25:05.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:25:05.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:25:05.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:25:05.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'EACCES: access denied' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:05.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:697: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring auth del client.xx-profile-ro 2026-03-31T20:25:05.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:698: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring auth del client.xx-profile-rw 2026-03-31T20:25:05.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:701: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring auth add client.xx-profile-rd2 mon 'allow profile role-definer' 2026-03-31T20:25:06.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:703: test_auth_profiles: ceph -n client.xx-profile-rd -k client.xx.keyring auth export 2026-03-31T20:25:06.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:706: test_auth_profiles: ceph -n client.xx-profile-rd2 -k client.xx.keyring.2 auth del client.xx-profile-rd 2026-03-31T20:25:06.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:709: 
test_auth_profiles: ceph auth del client.xx-profile-rd2 2026-03-31T20:25:07.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:710: test_auth_profiles: rm -f client.xx.keyring client.xx.keyring.2 2026-03-31T20:25:07.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:25:07.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_misc 2026-03-31T20:25:07.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:739: test_mon_misc: ceph osd dump 2026-03-31T20:25:07.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:739: test_mon_misc: grep '^epoch' 2026-03-31T20:25:07.428 INFO:tasks.workunit.client.0.vm03.stdout:epoch 145 2026-03-31T20:25:07.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:740: test_mon_misc: ceph --concise osd dump 2026-03-31T20:25:07.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:740: test_mon_misc: grep '^epoch' 2026-03-31T20:25:07.631 INFO:tasks.workunit.client.0.vm03.stdout:epoch 145 2026-03-31T20:25:07.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:742: test_mon_misc: ceph osd df 2026-03-31T20:25:07.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:742: test_mon_misc: grep 'MIN/MAX VAR' 2026-03-31T20:25:07.830 INFO:tasks.workunit.client.0.vm03.stdout:MIN/MAX VAR: 0.99/1.00 STDDEV: 0 2026-03-31T20:25:07.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:745: test_mon_misc: ceph df 2026-03-31T20:25:08.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:746: test_mon_misc: grep RAW /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:08.098 INFO:tasks.workunit.client.0.vm03.stdout:--- RAW STORAGE --- 2026-03-31T20:25:08.098 INFO:tasks.workunit.client.0.vm03.stdout:CLASS SIZE AVAIL USED RAW USED %RAW USED 2026-03-31T20:25:08.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:747: test_mon_misc: grep -v DIRTY /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:08.099 INFO:tasks.workunit.client.0.vm03.stdout:--- RAW STORAGE --- 2026-03-31T20:25:08.099 INFO:tasks.workunit.client.0.vm03.stdout:CLASS SIZE AVAIL USED RAW USED %RAW USED 2026-03-31T20:25:08.099 INFO:tasks.workunit.client.0.vm03.stdout:hdd 270 GiB 270 GiB 83 MiB 83 MiB 0.03 2026-03-31T20:25:08.099 INFO:tasks.workunit.client.0.vm03.stdout:TOTAL 270 GiB 270 GiB 83 MiB 83 MiB 0.03 2026-03-31T20:25:08.099 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:08.099 INFO:tasks.workunit.client.0.vm03.stdout:--- POOLS --- 2026-03-31T20:25:08.099 INFO:tasks.workunit.client.0.vm03.stdout:POOL ID PGS STORED OBJECTS USED %USED MAX AVAIL 2026-03-31T20:25:08.099 INFO:tasks.workunit.client.0.vm03.stdout:.mgr 1 1 449 KiB 2 464 KiB 0 128 GiB 2026-03-31T20:25:08.099 INFO:tasks.workunit.client.0.vm03.stdout:rbd 2 8 19 B 2 8 KiB 0 128 GiB 2026-03-31T20:25:08.099 
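test_mon_misc then re-runs the same space reports in machine-readable form and greps them for expected keys. A sketch of equivalent checks with a JSON processor (jq is an assumption here; it is not part of the recorded run; field names are as in the dumps below):

    # Epoch line from the plain dump, then selected fields from the JSON forms.
    ceph osd dump | grep '^epoch'
    ceph df --format json | jq '.stats | {total_bytes, total_used_bytes}'
    ceph df detail --format json | \
        jq '.pools[] | {name: .name, dirty: .stats.dirty, rd_bytes: .stats.rd_bytes}'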
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:748: test_mon_misc: ceph df detail 2026-03-31T20:25:08.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:749: test_mon_misc: grep DIRTY /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:08.365 INFO:tasks.workunit.client.0.vm03.stdout:POOL ID PGS STORED (DATA) (OMAP) OBJECTS USED (DATA) (OMAP) %USED MAX AVAIL QUOTA OBJECTS QUOTA BYTES DIRTY USED COMPR UNDER COMPR 2026-03-31T20:25:08.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:750: test_mon_misc: ceph df --format json 2026-03-31T20:25:08.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:751: test_mon_misc: grep total_bytes /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:08.631 INFO:tasks.workunit.client.0.vm03.stdout:{"stats":{"total_bytes":289910292480,"total_avail_bytes":289823539200,"total_used_bytes":86753280,"total_used_raw_bytes":86753280,"total_used_raw_ratio":0.00029924180125817657,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3},"stats_by_class":{"hdd":{"total_bytes":289910292480,"total_avail_bytes":289823539200,"total_used_bytes":86753280,"total_used_raw_bytes":86753280,"total_used_raw_ratio":0.00029924180125817657}},"pools":[{"name":".mgr","id":1,"stats":{"stored":459280,"objects":2,"kb_used":464,"bytes_used":475136,"percent_used":1.7257076478927047e-06,"max_avail":137663873024}},{"name":"rbd","id":2,"stats":{"stored":19,"objects":2,"kb_used":8,"bytes_used":8192,"percent_used":2.9753630670370512e-08,"max_avail":137663873024}}]} 2026-03-31T20:25:08.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:752: test_mon_misc: grep -v dirty /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:08.632 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:08.632 INFO:tasks.workunit.client.0.vm03.stdout:{"stats":{"total_bytes":289910292480,"total_avail_bytes":289823539200,"total_used_bytes":86753280,"total_used_raw_bytes":86753280,"total_used_raw_ratio":0.00029924180125817657,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3},"stats_by_class":{"hdd":{"total_bytes":289910292480,"total_avail_bytes":289823539200,"total_used_bytes":86753280,"total_used_raw_bytes":86753280,"total_used_raw_ratio":0.00029924180125817657}},"pools":[{"name":".mgr","id":1,"stats":{"stored":459280,"objects":2,"kb_used":464,"bytes_used":475136,"percent_used":1.7257076478927047e-06,"max_avail":137663873024}},{"name":"rbd","id":2,"stats":{"stored":19,"objects":2,"kb_used":8,"bytes_used":8192,"percent_used":2.9753630670370512e-08,"max_avail":137663873024}}]} 2026-03-31T20:25:08.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:753: test_mon_misc: ceph df detail --format json 2026-03-31T20:25:08.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:754: test_mon_misc: grep rd_bytes /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:08.900 
INFO:tasks.workunit.client.0.vm03.stdout:{"stats":{"total_bytes":289910292480,"total_avail_bytes":289823539200,"total_used_bytes":86753280,"total_used_raw_bytes":86753280,"total_used_raw_ratio":0.00029924180125817657,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3},"stats_by_class":{"hdd":{"total_bytes":289910292480,"total_avail_bytes":289823539200,"total_used_bytes":86753280,"total_used_raw_bytes":86753280,"total_used_raw_ratio":0.00029924180125817657}},"pools":[{"name":".mgr","id":1,"stats":{"stored":459280,"stored_data":459280,"stored_omap":0,"objects":2,"kb_used":464,"bytes_used":475136,"data_bytes_used":475136,"omap_bytes_used":0,"percent_used":1.7257076478927047e-06,"max_avail":137663873024,"quota_objects":0,"quota_bytes":0,"dirty":0,"rd":46,"rd_bytes":37888,"wr":57,"wr_bytes":598016,"compress_bytes_used":466944,"compress_under_bytes":910368,"stored_raw":918560,"avail_raw":275327754866}},{"name":"rbd","id":2,"stats":{"stored":19,"stored_data":19,"stored_omap":0,"objects":2,"kb_used":8,"bytes_used":8192,"data_bytes_used":8192,"omap_bytes_used":0,"percent_used":2.9753630670370512e-08,"max_avail":137663873024,"quota_objects":0,"quota_bytes":0,"dirty":0,"rd":0,"rd_bytes":0,"wr":2,"wr_bytes":2048,"compress_bytes_used":0,"compress_under_bytes":0,"stored_raw":38,"avail_raw":275327754866}}]} 2026-03-31T20:25:08.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:755: test_mon_misc: grep dirty /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:25:08.900 INFO:tasks.workunit.client.0.vm03.stdout:{"stats":{"total_bytes":289910292480,"total_avail_bytes":289823539200,"total_used_bytes":86753280,"total_used_raw_bytes":86753280,"total_used_raw_ratio":0.00029924180125817657,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3},"stats_by_class":{"hdd":{"total_bytes":289910292480,"total_avail_bytes":289823539200,"total_used_bytes":86753280,"total_used_raw_bytes":86753280,"total_used_raw_ratio":0.00029924180125817657}},"pools":[{"name":".mgr","id":1,"stats":{"stored":459280,"stored_data":459280,"stored_omap":0,"objects":2,"kb_used":464,"bytes_used":475136,"data_bytes_used":475136,"omap_bytes_used":0,"percent_used":1.7257076478927047e-06,"max_avail":137663873024,"quota_objects":0,"quota_bytes":0,"dirty":0,"rd":46,"rd_bytes":37888,"wr":57,"wr_bytes":598016,"compress_bytes_used":466944,"compress_under_bytes":910368,"stored_raw":918560,"avail_raw":275327754866}},{"name":"rbd","id":2,"stats":{"stored":19,"stored_data":19,"stored_omap":0,"objects":2,"kb_used":8,"bytes_used":8192,"data_bytes_used":8192,"omap_bytes_used":0,"percent_used":2.9753630670370512e-08,"max_avail":137663873024,"quota_objects":0,"quota_bytes":0,"dirty":0,"rd":0,"rd_bytes":0,"wr":2,"wr_bytes":2048,"compress_bytes_used":0,"compress_under_bytes":0,"stored_raw":38,"avail_raw":275327754866}}]} 2026-03-31T20:25:08.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:756: test_mon_misc: ceph df --format xml 2026-03-31T20:25:08.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:756: test_mon_misc: grep '' 2026-03-31T20:25:09.170 INFO:tasks.workunit.client.0.vm03.stdout:28991029248028982352281686769664867696640.0002992983208969235433328991029248028982352281686769664867696640.00029929832089692354.mgr145928024644751361.7257076478927047e-06137663873024rbd2192881922.9753630670370512e-08137663873024 2026-03-31T20:25:09.170 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:757: test_mon_misc: ceph df detail --format xml 2026-03-31T20:25:09.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:757: test_mon_misc: grep '<rd_bytes>' 2026-03-31T20:25:09.433 INFO:tasks.workunit.client.0.vm03.stdout:[XML detail report; element tags stripped in log capture; per-pool values match the JSON detail dump above (.mgr rd 46, rd_bytes 37888, wr 57, wr_bytes 598016; rbd rd 0, wr 2)] 2026-03-31T20:25:09.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:759: test_mon_misc: ceph fsid 2026-03-31T20:25:09.686 INFO:tasks.workunit.client.0.vm03.stdout:a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:25:09.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:760: test_mon_misc: ceph health 2026-03-31T20:25:09.937 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_OK 2026-03-31T20:25:09.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:761: test_mon_misc: ceph health detail 2026-03-31T20:25:10.216 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_OK 2026-03-31T20:25:10.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:762: test_mon_misc: ceph health --format json-pretty 2026-03-31T20:25:10.473 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:10.473 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:10.473 INFO:tasks.workunit.client.0.vm03.stdout: "status": "HEALTH_OK", 2026-03-31T20:25:10.473 INFO:tasks.workunit.client.0.vm03.stdout: "checks": {}, 2026-03-31T20:25:10.473 INFO:tasks.workunit.client.0.vm03.stdout: "mutes": [] 2026-03-31T20:25:10.473 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:10.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:763: test_mon_misc: ceph health detail --format xml-pretty 2026-03-31T20:25:10.747 INFO:tasks.workunit.client.0.vm03.stdout:[XML health report; element tags stripped in log capture: status HEALTH_OK, empty checks and mutes] 2026-03-31T20:25:10.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:765: test_mon_misc: ceph time-sync-status 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "time_skew_status": { 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "a": { 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "skew": 0, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "latency": 0, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "health": "HEALTH_OK" 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "b": {
2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "skew": 0, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "latency": 0.0023914940000000001, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "health": "HEALTH_OK" 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "c": { 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "skew": 0, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "latency": 0.00065781100000000001, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "health": "HEALTH_OK" 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "timechecks": { 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 4, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "round": 2, 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: "round_status": "finished" 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:11.024 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:11.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:767: test_mon_misc: ceph node ls 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "mon": { 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "vm03": [ 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "a", 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "b", 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "c" 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "osd": { 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "vm03": [ 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "mgr": { 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "vm03": [ 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: "x" 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:11.300 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:11.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:768: test_mon_misc: for t in mon osd mds mgr 2026-03-31T20:25:11.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:769: test_mon_misc: ceph node ls mon 2026-03-31T20:25:11.582 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:11.582 INFO:tasks.workunit.client.0.vm03.stdout: "vm03": [ 2026-03-31T20:25:11.582 INFO:tasks.workunit.client.0.vm03.stdout: "a", 2026-03-31T20:25:11.582 INFO:tasks.workunit.client.0.vm03.stdout: 
"b", 2026-03-31T20:25:11.582 INFO:tasks.workunit.client.0.vm03.stdout: "c" 2026-03-31T20:25:11.582 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:11.582 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:11.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:768: test_mon_misc: for t in mon osd mds mgr 2026-03-31T20:25:11.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:769: test_mon_misc: ceph node ls osd 2026-03-31T20:25:11.867 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:11.867 INFO:tasks.workunit.client.0.vm03.stdout: "vm03": [ 2026-03-31T20:25:11.867 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-31T20:25:11.867 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:25:11.867 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:25:11.867 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:11.867 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:11.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:768: test_mon_misc: for t in mon osd mds mgr 2026-03-31T20:25:11.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:769: test_mon_misc: ceph node ls mds 2026-03-31T20:25:12.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:768: test_mon_misc: for t in mon osd mds mgr 2026-03-31T20:25:12.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:769: test_mon_misc: ceph node ls mgr 2026-03-31T20:25:12.424 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:12.424 INFO:tasks.workunit.client.0.vm03.stdout: "vm03": [ 2026-03-31T20:25:12.424 INFO:tasks.workunit.client.0.vm03.stdout: "x" 2026-03-31T20:25:12.424 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:12.424 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:12.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:772: test_mon_misc: ceph_watch_start 2026-03-31T20:25:12.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:155: ceph_watch_start: local whatch_opt=--watch 2026-03-31T20:25:12.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:157: ceph_watch_start: '[' -n '' ']' 2026-03-31T20:25:12.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:164: ceph_watch_start: CEPH_WATCH_FILE=/tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:25:12.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:166: ceph_watch_start: CEPH_WATCH_PID=34356 2026-03-31T20:25:12.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:165: ceph_watch_start: ceph --watch 2026-03-31T20:25:12.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: seq 3 2026-03-31T20:25:12.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: for i in `seq 3` 2026-03-31T20:25:12.438 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:171: ceph_watch_start: grep -q cluster /tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:25:12.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:172: ceph_watch_start: sleep 1 2026-03-31T20:25:13.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: for i in `seq 3` 2026-03-31T20:25:13.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:171: ceph_watch_start: grep -q cluster /tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:25:13.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:172: ceph_watch_start: sleep 1 2026-03-31T20:25:14.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: for i in `seq 3` 2026-03-31T20:25:14.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:171: ceph_watch_start: grep -q cluster /tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:25:14.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:172: ceph_watch_start: sleep 1 2026-03-31T20:25:15.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:773: test_mon_misc: date 2026-03-31T20:25:15.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:773: test_mon_misc: mymsg='this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026' 2026-03-31T20:25:15.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:774: test_mon_misc: ceph log 'this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026' 2026-03-31T20:25:16.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:775: test_mon_misc: ceph log last 2026-03-31T20:25:16.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:775: test_mon_misc: grep 'this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026' 2026-03-31T20:25:17.254 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:25:15.741741+0000 client.admin (client.?) 0 : cluster [INF] this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026 2026-03-31T20:25:17.254 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:25:16.032292+0000 client.admin (client.?) 0 : cluster [INF] this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026 2026-03-31T20:25:17.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:776: test_mon_misc: ceph log last 100 2026-03-31T20:25:17.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:776: test_mon_misc: grep 'this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026' 2026-03-31T20:25:17.538 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:25:15.741741+0000 client.admin (client.?) 0 : cluster [INF] this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026 2026-03-31T20:25:17.538 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:25:16.032292+0000 client.admin (client.?) 
0 : cluster [INF] this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026 2026-03-31T20:25:17.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:777: test_mon_misc: ceph_watch_wait 'this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026' 2026-03-31T20:25:17.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:178: ceph_watch_wait: local 'regexp=this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026' 2026-03-31T20:25:17.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:179: ceph_watch_wait: local timeout=30 2026-03-31T20:25:17.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:181: ceph_watch_wait: '[' -n '' ']' 2026-03-31T20:25:17.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:185: ceph_watch_wait: seq 30 2026-03-31T20:25:17.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:185: ceph_watch_wait: for i in `seq ${timeout}` 2026-03-31T20:25:17.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:186: ceph_watch_wait: grep -q 'this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026' /tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:25:17.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:186: ceph_watch_wait: break 2026-03-31T20:25:17.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:190: ceph_watch_wait: kill 34356 2026-03-31T20:25:17.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:192: ceph_watch_wait: grep 'this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026' /tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:25:17.541 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:25:15.741741+0000 client.admin [INF] this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026 2026-03-31T20:25:17.541 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:25:16.032292+0000 client.admin [INF] this is a test log message 26274.Tue Mar 31 20:25:15 UTC 2026 2026-03-31T20:25:17.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:779: test_mon_misc: ceph mgr stat 2026-03-31T20:25:17.804 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:17.804 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 5, 2026-03-31T20:25:17.804 INFO:tasks.workunit.client.0.vm03.stdout: "available": true, 2026-03-31T20:25:17.804 INFO:tasks.workunit.client.0.vm03.stdout: "active_name": "x", 2026-03-31T20:25:17.804 INFO:tasks.workunit.client.0.vm03.stdout: "num_standby": 0 2026-03-31T20:25:17.804 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:17.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:780: test_mon_misc: ceph mgr dump 2026-03-31T20:25:18.075 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 5, 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "active_gid": 4119, 2026-03-31T20:25:18.076 
INFO:tasks.workunit.client.0.vm03.stdout: "active_name": "x", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "active_addrs": { 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6812", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 293048097 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "active_addr": "192.168.123.103:6812/293048097", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "active_change": "2026-03-31T20:21:26.514949+0000", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "active_mgr_features": 4541880224203014143, 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "available": true, 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "standbys": [], 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "modules": [ 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "iostat", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "nfs" 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "available_modules": [ 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "name": "alerts", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "interval": { 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "name": "interval", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "60", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "How frequently to reexamine health status", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: 
"level": "advanced", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.076 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "error", 
2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.077 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "smtp_destination": { 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "name": "smtp_destination", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Email address to send alerts to, use commas to separate multiple", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "smtp_from_name": { 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "name": "smtp_from_name", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "Ceph", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 
2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Email From: name", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "smtp_host": { 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "name": "smtp_host", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "SMTP server", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.078 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "smtp_password": { 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "name": "smtp_password", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Password to authenticate with", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "smtp_port": { 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "name": "smtp_port", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "465", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 
2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "SMTP port", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "smtp_sender": { 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "name": "smtp_sender", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "SMTP envelope sender", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "smtp_ssl": { 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "name": "smtp_ssl", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Use SSL to connect to SMTP server", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "smtp_user": { 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "name": "smtp_user", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.079 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 
2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "User to authenticate as", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "name": "balancer", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "active": { 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "name": "active", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "automatically balance PGs across cluster", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "begin_time": { 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "name": "begin_time", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.080 
INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0000", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "beginning time of day to automatically balance", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "This is a time of day in the format HHMM.", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "begin_weekday": { 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "name": "begin_weekday", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "type": "uint", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.080 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "min": "0", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "max": "6", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Restrict automatic balancing to this day of the week or later", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "0 = Sunday, 1 = Monday, etc.", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "crush_compat_max_iterations": { 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "name": "crush_compat_max_iterations", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "type": "uint", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "25", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "min": "1", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "max": "250", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "maximum number of iterations to attempt optimization", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "crush_compat_metrics": { 2026-03-31T20:25:18.081 
INFO:tasks.workunit.client.0.vm03.stdout: "name": "crush_compat_metrics", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "pgs,objects,bytes", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "metrics with which to calculate OSD utilization", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "crush_compat_step": { 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "name": "crush_compat_step", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "type": "float", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0.5", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "min": "0.001", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "max": "0.999", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "aggressiveness of optimization", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": ".99 is very aggressive, .01 is less aggressive", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "end_time": { 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "name": "end_time", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "2359", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "ending time of day to automatically balance", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "This is a time of day in the format HHMM.", 2026-03-31T20:25:18.081 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.082 
INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "end_weekday": { 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "name": "end_weekday", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "type": "uint", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "min": "0", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "max": "6", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Restrict automatic balancing to days of the week earlier than this", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "0 = Sunday, 1 = Monday, etc.", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.082 
INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.082 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "min_score": { 2026-03-31T20:25:18.083 
INFO:tasks.workunit.client.0.vm03.stdout: "name": "min_score", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "type": "float", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "minimum score, below which no optimization is attempted", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "mode": { 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mode", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "upmap", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "crush-compat", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "none", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "read", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "upmap", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "upmap-read" 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Balancer mode", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "pool_ids": { 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "name": "pool_ids", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.083 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "pools which the automatic balancing will be limited to", 
2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "sleep_interval": { 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sleep_interval", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "60", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "how frequently to wake up and attempt optimization", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "update_pg_upmap_activity": { 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "name": "update_pg_upmap_activity", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Updates pg_upmap activity stats to be used in `balancer 
status detail`", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "upmap_max_deviation": { 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "name": "upmap_max_deviation", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "5", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "min": "1", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "deviation below which no optimization is attempted", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "If the number of PGs are within this count then no optimization is attempted", 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.084 INFO:tasks.workunit.client.0.vm03.stdout: "upmap_max_optimizations": { 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "name": "upmap_max_optimizations", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "type": "uint", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "10", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "maximum upmap optimizations to make per attempt", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "name": "cephadm", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "agent_down_multiplier": { 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "name": "agent_down_multiplier", 
2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "type": "float", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "3.0", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Multiplied by agent refresh rate to calculate how long agent must not report before being marked down", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "agent_refresh_rate": { 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "name": "agent_refresh_rate", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "20", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "How often agent on each host will try to gather and send metadata", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "agent_starting_port": { 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "name": "agent_starting_port", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "4721", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: }, 
2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "allow_ptrace": { 2026-03-31T20:25:18.085 INFO:tasks.workunit.client.0.vm03.stdout: "name": "allow_ptrace", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "allow SYS_PTRACE capability on ceph containers", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "The SYS_PTRACE capability is needed to attach to a process with gdb or strace. Enabling this options can allow debugging daemons that encounter problems at runtime.", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "autotune_interval": { 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "name": "autotune_interval", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "600", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "how frequently to autotune daemon memory", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "autotune_memory_target_ratio": { 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "name": "autotune_memory_target_ratio", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "type": "float", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0.7", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "ratio of total system memory to divide amongst autotuned daemons", 2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "cephadm_log_destination": {
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "name": "cephadm_log_destination",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "file",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "file,syslog",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "syslog"
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Destination for cephadm command's persistent logging",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "certificate_automated_rotation_enabled": {
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "name": "certificate_automated_rotation_enabled",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.086 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "This flag controls whether cephadm automatically rotates certificates upon expiration.",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "certificate_check_debug_mode": {
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "name": "certificate_check_debug_mode",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "FOR TESTING ONLY: This flag forces the certificate check instead of waiting for certificate_check_period.",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "certificate_check_period": {
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "name": "certificate_check_period",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "1",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "min": "0",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "max": "30",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Specifies how often (in days) the certificate should be checked for validity.",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "certificate_duration_days": {
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "name": "certificate_duration_days",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "1095",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "min": "90",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "max": "3650",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Specifies the duration of self certificates generated and signed by cephadm root CA",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "certificate_renewal_threshold_days": {
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "name": "certificate_renewal_threshold_days",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "30",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "min": "10",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "max": "90",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Specifies the lead time in days to initiate certificate renewal before expiration.",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.087 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "cgroups_split": {
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "name": "cgroups_split",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Pass --cgroups=split when cephadm creates containers (currently podman only)",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "config_checks_enabled": {
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "name": "config_checks_enabled",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Enable or disable the cephadm configuration analysis",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "config_dashboard": {
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "name": "config_dashboard",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "manage configs like API endpoints in Dashboard.",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_alertmanager": {
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_alertmanager",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/prometheus/alertmanager:v0.28.1",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Alertmanager container image",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_base": {
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_base",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1,
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/ceph/ceph",
2026-03-31T20:25:18.088 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Container image name, without the tag",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_elasticsearch": {
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_elasticsearch",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/omrizeneva/elasticsearch:6.8.23",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Elasticsearch container image",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_grafana": {
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_grafana",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/ceph/grafana:12.3.1",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Grafana container image",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_haproxy": {
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_haproxy",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/ceph/haproxy:2.3",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Haproxy container image",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_jaeger_agent": {
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_jaeger_agent",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/jaegertracing/jaeger-agent:1.29",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Jaeger agent container image",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.089 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_jaeger_collector": {
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_jaeger_collector",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/jaegertracing/jaeger-collector:1.29",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Jaeger collector container image",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_jaeger_query": {
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_jaeger_query",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/jaegertracing/jaeger-query:1.29",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Jaeger query container image",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_keepalived": {
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_keepalived",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/ceph/keepalived:2.2.4",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Keepalived container image",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_loki": {
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_loki",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.090 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "docker.io/grafana/loki:3.0.0",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Loki container image",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_nginx": {
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_nginx",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/ceph/nginx:sclorg-nginx-126",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Nginx container image",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_node_exporter": {
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_node_exporter",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/prometheus/node-exporter:v1.9.1",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Node exporter container image",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_nvmeof": {
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_nvmeof",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/ceph/nvmeof:1.5",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Nvmeof container image",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_oauth2_proxy": {
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_oauth2_proxy",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/oauth2-proxy/oauth2-proxy:v7.6.0",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Oauth2 proxy container image",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.091 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_prometheus": {
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_prometheus",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/prometheus/prometheus:v3.6.0",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Prometheus container image",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_promtail": {
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_promtail",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "docker.io/grafana/promtail:3.0.0",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Promtail container image",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_samba": {
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_samba",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/samba.org/samba-server:ceph20-centos-amd64",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Samba container image",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_samba_metrics": {
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_samba_metrics",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io/samba.org/samba-metrics:ceph20-centos-amd64",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Samba metrics container image",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "container_image_snmp_gateway": {
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_image_snmp_gateway",
2026-03-31T20:25:18.092 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "docker.io/maxwo/snmp-notifier:v1.2.1",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Snmp gateway container image",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "container_init": {
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "name": "container_init",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Run podman/docker with `--init`",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "daemon_cache_timeout": {
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "name": "daemon_cache_timeout",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "600",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "seconds to cache service (daemon) inventory",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "default_cephadm_command_timeout": {
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "name": "default_cephadm_command_timeout",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "900",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Default timeout applied to cephadm commands run directly on the host (in seconds)",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "default_registry": {
2026-03-31T20:25:18.093 INFO:tasks.workunit.client.0.vm03.stdout: "name": "default_registry",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "quay.io",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Search-registry to which we should normalize unqualified image names. This is not the default registry",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "device_cache_timeout": {
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "name": "device_cache_timeout",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "1800",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "seconds to cache device inventory",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "device_enhanced_scan": {
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "name": "device_enhanced_scan",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Use libstoragemgmt during device scans",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "facts_cache_timeout": {
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "name": "facts_cache_timeout",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "60",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "seconds to cache host facts data",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.094 INFO:tasks.workunit.client.0.vm03.stdout: "grafana_dashboards_path": {
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "name": "grafana_dashboards_path",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "/etc/grafana/dashboards/ceph-dashboard/",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "location of dashboards to include in grafana deployments",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "host_check_interval": {
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "name": "host_check_interval",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "600",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "how frequently to perform a host check",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "hw_monitoring": {
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "name": "hw_monitoring",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Deploy hw monitoring daemon on every host.",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "inventory_list_all": {
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "name": "inventory_list_all",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": {
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1,
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.095 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "critical",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "debug",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "error",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "info",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "warning"
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "log_refresh_metadata": {
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_refresh_metadata",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": {
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "log to the \"cephadm\" cluster log channel\"",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": {
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1,
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "critical",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "debug",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "error",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "info",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "warning"
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.096 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": {
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1,
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "manage_etc_ceph_ceph_conf": {
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "name": "manage_etc_ceph_ceph_conf",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Manage and own /etc/ceph/ceph.conf on the hosts.",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "manage_etc_ceph_ceph_conf_hosts": {
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "name": "manage_etc_ceph_ceph_conf_hosts",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "*",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "max_count_per_host": {
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "name": "max_count_per_host",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "10",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "max number of daemons per service per host",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "max_osd_draining_count": {
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "name": "max_osd_draining_count",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "10",
2026-03-31T20:25:18.097 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "max number of osds that will be drained simultaneously when osds are removed",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "migration_current": {
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "name": "migration_current",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "internal - do not modify",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "mode": {
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mode",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "root",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "cephadm-package",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "root"
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "mode for remote execution of cephadm",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "oob_default_addr": {
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "name": "oob_default_addr",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "169.254.1.1",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Default address for RedFish API (oob management).",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "prometheus_alerts_path": {
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "name": "prometheus_alerts_path",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "/etc/prometheus/ceph/ceph_default_alerts.yml",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.098 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "location of alerts to include in prometheus deployments",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "registry_insecure": {
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "name": "registry_insecure",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Registry is to be considered insecure (no TLS available). Only for development purposes.",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "registry_password": {
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "name": "registry_password",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Custom repository password. Only used for logging into a registry.",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "registry_url": {
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "name": "registry_url",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Registry url for login purposes. This is not the default registry",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "registry_username": {
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "name": "registry_username",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Custom repository username. Only used for logging into a registry.",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "secure_monitoring_stack": {
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "name": "secure_monitoring_stack",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.099 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Enable TLS security for all the monitoring stack daemons",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "service_discovery_port": {
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "name": "service_discovery_port",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0,
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "8765",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "cephadm service discovery port",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [],
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": []
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": {
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1,
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "min": "",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "max": "",
2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [],
2026-03-31T20:25:18.100 
INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "ssh_config_file": { 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ssh_config_file", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "customized SSH config file to connect to managed hosts", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "ssh_keepalive_count_max": { 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ssh_keepalive_count_max", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "3", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.100 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "How many times ssh connections can fail liveness checks before the host is marked offline", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "ssh_keepalive_interval": { 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ssh_keepalive_interval", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "7", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.101 
INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "How often ssh connections are checked for liveness", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "stray_daemon_check_interval": { 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "name": "stray_daemon_check_interval", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "1800", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "how frequently cephadm should check for the presence of stray daemons", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "use_agent": { 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "name": "use_agent", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Use cephadm agent on each host to gather and send metadata", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "use_repo_digest": { 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "name": "use_repo_digest", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.101 
INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Automatically convert image tags to image digest. Make sure all daemons use the same image", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "warn_on_failed_host_check": { 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "name": "warn_on_failed_host_check", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.101 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "raise a health warning if the host check fails", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "warn_on_stray_daemons": { 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "name": "warn_on_stray_daemons", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "raise a health warning if daemons are detected that are not managed by cephadm", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "warn_on_stray_hosts": { 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "name": "warn_on_stray_hosts", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.102 
INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "raise a health warning if daemons are detected on a host that is not managed by cephadm", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "name": "crash", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.102 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: 
"min": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "retain_interval": { 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "name": "retain_interval", 
2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "31536000", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "how long to retain crashes before pruning them", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.103 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "warn_recent_interval": { 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "name": "warn_recent_interval", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "1209600", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "time interval in which to warn about recent crashes", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.104 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "name": "dashboard", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "ACCOUNT_LOCKOUT_ATTEMPTS": { 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ACCOUNT_LOCKOUT_ATTEMPTS", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "10", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "ALERTMANAGER_API_HOST": { 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ALERTMANAGER_API_HOST", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.104 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "ALERTMANAGER_API_SSL_VERIFY": { 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ALERTMANAGER_API_SSL_VERIFY", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: 
"desc": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "AUDIT_API_ENABLED": { 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "name": "AUDIT_API_ENABLED", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "AUDIT_API_LOG_PAYLOAD": { 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "name": "AUDIT_API_LOG_PAYLOAD", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "ENABLE_BROWSABLE_API": { 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ENABLE_BROWSABLE_API", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: 
"long_desc": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "FEATURE_TOGGLE_CEPHFS": { 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "name": "FEATURE_TOGGLE_CEPHFS", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.105 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "FEATURE_TOGGLE_DASHBOARD": { 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "name": "FEATURE_TOGGLE_DASHBOARD", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "FEATURE_TOGGLE_ISCSI": { 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "name": "FEATURE_TOGGLE_ISCSI", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.106 
INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "FEATURE_TOGGLE_MIRRORING": { 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "name": "FEATURE_TOGGLE_MIRRORING", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "FEATURE_TOGGLE_NFS": { 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "name": "FEATURE_TOGGLE_NFS", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "FEATURE_TOGGLE_RBD": { 2026-03-31T20:25:18.106 INFO:tasks.workunit.client.0.vm03.stdout: "name": "FEATURE_TOGGLE_RBD", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.107 
INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "FEATURE_TOGGLE_RGW": { 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "name": "FEATURE_TOGGLE_RGW", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE": { 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "name": "GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "GRAFANA_API_PASSWORD": { 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "name": "GRAFANA_API_PASSWORD", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "admin", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 
2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "GRAFANA_API_SSL_VERIFY": { 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "name": "GRAFANA_API_SSL_VERIFY", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.107 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "GRAFANA_API_URL": { 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "name": "GRAFANA_API_URL", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "GRAFANA_API_USERNAME": { 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "name": "GRAFANA_API_USERNAME", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "admin", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.108 
INFO:tasks.workunit.client.0.vm03.stdout: "GRAFANA_FRONTEND_API_URL": { 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "name": "GRAFANA_FRONTEND_API_URL", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "GRAFANA_UPDATE_DASHBOARDS": { 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "name": "GRAFANA_UPDATE_DASHBOARDS", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "ISCSI_API_SSL_VERIFICATION": { 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ISCSI_API_SSL_VERIFICATION", 2026-03-31T20:25:18.108 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: 
"ISSUE_TRACKER_API_KEY": { 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ISSUE_TRACKER_API_KEY", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "MANAGED_BY_CLUSTERS": { 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "name": "MANAGED_BY_CLUSTERS", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "[]", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "MULTICLUSTER_CONFIG": { 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "name": "MULTICLUSTER_CONFIG", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "{}", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "PROMETHEUS_API_HOST": { 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: 
"name": "PROMETHEUS_API_HOST", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.109 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "PROMETHEUS_API_SSL_VERIFY": { 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "name": "PROMETHEUS_API_SSL_VERIFY", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "PROM_ALERT_CREDENTIAL_CACHE_TTL": { 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "name": "PROM_ALERT_CREDENTIAL_CACHE_TTL", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "60", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_CHECK_COMPLEXITY_ENABLED": { 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "name": 
"PWD_POLICY_CHECK_COMPLEXITY_ENABLED", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED": { 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "name": "PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_CHECK_LENGTH_ENABLED": { 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "name": "PWD_POLICY_CHECK_LENGTH_ENABLED", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.110 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_CHECK_OLDPWD_ENABLED": { 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: 
"name": "PWD_POLICY_CHECK_OLDPWD_ENABLED", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED": { 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "name": "PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED": { 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "name": "PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_CHECK_USERNAME_ENABLED": { 2026-03-31T20:25:18.111 
INFO:tasks.workunit.client.0.vm03.stdout: "name": "PWD_POLICY_CHECK_USERNAME_ENABLED", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.111 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_ENABLED": { 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "name": "PWD_POLICY_ENABLED", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_EXCLUSION_LIST": { 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "name": "PWD_POLICY_EXCLUSION_LIST", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: "PWD_POLICY_MIN_COMPLEXITY": { 
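Note the shipped password policy: the policy itself, the length check, and the old-password check are on by default, while the complexity, exclusion-list, repetitive-chars, sequential-chars, and username checks are off. Tightening it follows the same config-key route; a sketch, not exercised by this job:

    # enable two of the optional password checks that default to False above
    ceph config set mgr mgr/dashboard/PWD_POLICY_CHECK_COMPLEXITY_ENABLED true
    ceph config set mgr mgr/dashboard/PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED true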
2026-03-31T20:25:18.112 INFO:tasks.workunit.client.0.vm03.stdout: [dashboard module_options, continued]
      "PWD_POLICY_MIN_COMPLEXITY": {"type": "int", "level": "advanced", "flags": 0, "default_value": "10"},
      "PWD_POLICY_MIN_LENGTH": {"type": "int", "level": "advanced", "flags": 0, "default_value": "8"},
      "REST_REQUESTS_TIMEOUT": {"type": "int", "level": "advanced", "flags": 0, "default_value": "45"},
      "RGW_API_ACCESS_KEY": {"type": "str", "level": "advanced", "flags": 0, "default_value": ""},
      "RGW_API_ADMIN_RESOURCE": {"type": "str", "level": "advanced", "flags": 0, "default_value": "admin"},
      "RGW_API_SECRET_KEY": {"type": "str", "level": "advanced", "flags": 0, "default_value": ""},
      "RGW_API_SSL_VERIFY": {"type": "bool", "level": "advanced", "flags": 0, "default_value": "True"},
      "RGW_HOSTNAME_PER_DAEMON": {"type": "str", "level": "advanced", "flags": 0, "default_value": ""},
      "UNSAFE_TLS_v1_2": {"type": "bool", "level": "advanced", "flags": 0, "default_value": "False"},
      "USER_PWD_EXPIRATION_SPAN": {"type": "int", "level": "advanced", "flags": 0, "default_value": "0"},
      "USER_PWD_EXPIRATION_WARNING_1": {"type": "int", "level": "advanced", "flags": 0, "default_value": "10"},
      "USER_PWD_EXPIRATION_WARNING_2": {"type": "int", "level": "advanced", "flags": 0, "default_value": "5"},
2026-03-31T20:25:18.115 INFO:tasks.workunit.client.0.vm03.stdout: [dashboard module_options, continued]
      "cross_origin_url": {"type": "str", "level": "advanced", "flags": 0, "default_value": ""},
      "crt_file": {"type": "str", "level": "advanced", "flags": 0, "default_value": ""},
      "crypto_caller": {"type": "str", "level": "advanced", "flags": 0, "default_value": ""},
      "debug": {"type": "bool", "level": "advanced", "flags": 0, "default_value": "False", "desc": "Enable/disable debug options"},
      "jwt_token_ttl": {"type": "int", "level": "advanced", "flags": 0, "default_value": "28800"},
      "key_file": {"type": "str", "level": "advanced", "flags": 0, "default_value": ""},
      "log_level": {"type": "str", "level": "advanced", "flags": 1, "default_value": "", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"]},
      "log_to_cluster": {"type": "bool", "level": "advanced", "flags": 1, "default_value": "False"},
      "log_to_cluster_level": {"type": "str", "level": "advanced", "flags": 1, "default_value": "info", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"]},
      "log_to_file": {"type": "bool", "level": "advanced", "flags": 1, "default_value": "False"},
      "motd": {"type": "str", "level": "advanced", "flags": 0, "default_value": "", "desc": "The message of the day"},
      "redirect_resolve_ip_addr": {"type": "bool", "level": "advanced", "flags": 0, "default_value": "False"},
      "server_addr": {"type": "str", "level": "advanced", "flags": 0, "default_value": "::"},
      "server_port": {"type": "int", "level": "advanced", "flags": 0, "default_value": "8080"},
      "sqlite3_killpoint": {"type": "int", "level": "dev", "flags": 1, "default_value": "0"},
      "ssl": {"type": "bool", "level": "advanced", "flags": 0, "default_value": "True"},
      "ssl_server_port": {"type": "int", "level": "advanced", "flags": 0, "default_value": "8443"},
      "sso_oauth2": {"type": "bool", "level": "advanced", "flags": 0, "default_value": "False"},
      "standby_behaviour": {"type": "str", "level": "advanced", "flags": 0, "default_value": "redirect", "enum_allowed": ["error", "redirect"]},
      "standby_error_status_code": {"type": "int", "level": "advanced", "flags": 0, "default_value": "500", "min": "400", "max": "599"},
      "url_prefix": {"type": "str", "level": "advanced", "flags": 0, "default_value": ""}
2026-03-31T20:25:18.119 INFO:tasks.workunit.client.0.vm03.stdout: [end of dashboard module; listing continues, same elision convention as above]
    }
  },
  {
    "name": "devicehealth",
    "can_run": true,
    "error_string": "",
    "module_options": {
      "enable_monitoring": {"type": "bool", "level": "advanced", "flags": 1, "default_value": "True", "desc": "monitor device health metrics"},
      "log_level": {"type": "str", "level": "advanced", "flags": 1, "default_value": "", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"]},
      "log_to_cluster": {"type": "bool", "level": "advanced", "flags": 1, "default_value": "False"},
      "log_to_cluster_level": {"type": "str", "level": "advanced", "flags": 1, "default_value": "info", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"]},
      "log_to_file": {"type": "bool", "level": "advanced", "flags": 1, "default_value": "False"},
      "mark_out_threshold": {"type": "secs", "level": "advanced", "flags": 1, "default_value": "2419200", "desc": "automatically mark OSD if it may fail before this long"},
      "pool_name": {"type": "str", "level": "advanced", "flags": 1, "default_value": "device_health_metrics", "desc": "name of pool in which to store device health metrics"},
      "retention_period": {"type": "secs", "level": "advanced", "flags": 1, "default_value": "15552000", "desc": "how long to retain device health metrics"},
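The devicehealth defaults above are self-describing: SMART metrics land in the device_health_metrics pool, are retained for 15552000 s (180 days), and an OSD whose device is predicted to fail within 2419200 s (4 weeks) is marked out automatically; the scrape and wakeup cadences follow below. The monitoring toggle maps onto a documented command; a sketch (the override value is illustrative):

    # flip enable_monitoring on, then raise the auto-mark-out horizon to 6 weeks
    ceph device monitoring on
    ceph config set mgr mgr/devicehealth/mark_out_threshold 3628800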
"desc": "how long to retain device health metrics", 2026-03-31T20:25:18.121 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.121 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.121 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.121 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "scrape_frequency": { 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "name": "scrape_frequency", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "86400", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "how frequently to scrape device health metrics", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "self_heal": { 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "name": "self_heal", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "preemptively heal cluster around devices that may fail", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "sleep_interval": { 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sleep_interval", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "600", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.122 
INFO:tasks.workunit.client.0.vm03.stdout: "desc": "how frequently to wake up and check device health", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "warn_threshold": { 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "name": "warn_threshold", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "7257600", 2026-03-31T20:25:18.122 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "raise health warning if OSD may fail before this long", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "name": "diskprediction_local", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: 
"type": "str", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 
2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.123 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "predict_interval": { 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "name": "predict_interval", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "86400", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "predictor_model": { 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "name": "predictor_model", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "prophetstor", 2026-03-31T20:25:18.124 
INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "sleep_interval": { 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sleep_interval", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "600", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.124 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "name": "influx", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": false, 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "influxdb python module not found", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "batch_size": { 
2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "name": "batch_size", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "5000", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "How big batches of data points should be when sending to InfluxDB.", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "database": { 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "name": "database", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "ceph", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "InfluxDB database name. 
You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "hostname": { 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "name": "hostname", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "InfluxDB server hostname", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "interval": { 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "name": "interval", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.125 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "30", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "min": "5", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Time between reports to InfluxDB. 
Default 30 seconds.", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: 
"flags": 1, 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.126 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "password": { 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "name": "password", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "password of InfluxDB server user", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: }, 
2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "port": { 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "name": "port", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "8086", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "InfluxDB server port", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "ssl": { 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "name": "ssl", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "false", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.127 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Use https connection for InfluxDB server. 
Use \"true\" or \"false\".", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "threads": { 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "name": "threads", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "5", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "min": "1", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "max": "32", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "How many worker threads should be spawned for sending data to InfluxDB.", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "username": { 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "name": "username", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "username of InfluxDB server user", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "verify_ssl": { 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "name": "verify_ssl", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "true", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Verify https cert for InfluxDB 
server. Use \"true\" or \"false\".", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "name": "insights", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.128 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 
2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.129 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 
2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "name": "iostat", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.130 
INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.130 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.131 
INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "name": "localpool", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "failure_domain": { 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "name": "failure_domain", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "host", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "failure domain for any created local pool", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "what failure domain we should separate data replicas across.", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.131 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.132 
INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: 
"log_to_file": { 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.132 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "min_size": { 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "name": "min_size", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "default min_size for any created local pool", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "value to set min_size to (unchanged from Ceph's default if this option is not set)", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "num_rep": { 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "name": "num_rep", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "3", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "default replica count for any created local pool", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.133 
INFO:tasks.workunit.client.0.vm03.stdout: "pg_num": { 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "name": "pg_num", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "128", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "default pg_num for any created local pool", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "prefix": { 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "name": "prefix", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "name prefix for any created local pool", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.133 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "subtree": { 2026-03-31T20:25:18.134 
INFO:tasks.workunit.client.0.vm03.stdout: "name": "subtree", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "rack", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "CRUSH level for which to create a local pool", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "which CRUSH subtree type the module should create a pool for.", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mirroring", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.134 
INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.134 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.135 
INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "name": "nfs", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.135 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.136 
INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 
2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.136 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "name": "orchestrator", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "fail_fs": { 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "name": "fail_fs", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Fail filesystem for rapid multi-rank mds upgrade", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.137 
INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.137 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.138 
INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "orchestrator": { 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "name": "orchestrator", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "cephadm", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "rook", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "test_orchestrator" 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Orchestrator backend", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.138 
INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "name": "osd_perf_query", 2026-03-31T20:25:18.138 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: 
"default_value": "False", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.139 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 
2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "name": "osd_support", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.140 
INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.140 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.141 
INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "name": "pg_autoscaler", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.141 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 
2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.142 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: 
"desc": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "sleep_interval": { 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sleep_interval", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "type": "secs", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "60", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "threshold": { 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "name": "threshold", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "type": "float", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "3.0", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "min": "1.0", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "scaling threshold", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "The factor by which 
the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. Cannot be less than 1.0", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "name": "progress", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "allow_pg_recovery_event": { 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "name": "allow_pg_recovery_event", 2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "allow the module to show pg recovery progress", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "enabled": { 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "name": "enabled", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 
2026-03-31T20:25:18.143 INFO:tasks.workunit.client.0.vm03.stdout:    { "name": "progress", "can_run": true, "error_string": "", "module_options": {
2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout:        "allow_pg_recovery_event": {"name": "allow_pg_recovery_event", "type": "bool", "level": "advanced", "flags": 1, "default_value": "False", "min": "", "max": "", "enum_allowed": [], "desc": "allow the module to show pg recovery progress", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout:        "enabled": {"name": "enabled", "type": "bool", "level": "advanced", "flags": 0, "default_value": "True", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout:        "log_level": {"name": "log_level", "type": "str", "level": "advanced", "flags": 1, "default_value": "", "min": "", "max": "", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.144 INFO:tasks.workunit.client.0.vm03.stdout:        "log_to_cluster": {"name": "log_to_cluster", "type": "bool", "level": "advanced", "flags": 1, "default_value": "False", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.145 INFO:tasks.workunit.client.0.vm03.stdout:        "log_to_cluster_level": {"name": "log_to_cluster_level", "type": "str", "level": "advanced", "flags": 1, "default_value": "info", "min": "", "max": "", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.145 INFO:tasks.workunit.client.0.vm03.stdout:        "log_to_file": {"name": "log_to_file", "type": "bool", "level": "advanced", "flags": 1, "default_value": "False", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.145 INFO:tasks.workunit.client.0.vm03.stdout:        "max_completed_events": {"name": "max_completed_events", "type": "int", "level": "advanced", "flags": 1, "default_value": "50", "min": "", "max": "", "enum_allowed": [], "desc": "number of past completed events to remember", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.145 INFO:tasks.workunit.client.0.vm03.stdout:        "sleep_interval": {"name": "sleep_interval", "type": "secs", "level": "advanced", "flags": 1, "default_value": "5", "min": "", "max": "", "enum_allowed": [], "desc": "how long the module is going to sleep", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout:        "sqlite3_killpoint": {"name": "sqlite3_killpoint", "type": "int", "level": "dev", "flags": 1, "default_value": "0", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []}
2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout:    },
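
Each option record above carries a "flags" bitmask. In Ceph's option machinery the 0x1 bit corresponds to FLAG_RUNTIME (the value may be changed without restarting the daemon); treat that mapping as an assumption drawn from src/common/options.h rather than something this log states. A quick way to classify a dump like the one above:

    # Assumed flag bit, per src/common/options.h (FLAG_RUNTIME = 0x1).
    FLAG_RUNTIME = 0x1

    # Sample "flags" values taken from the "progress" module options printed above.
    options = {
        "enabled": 0,
        "max_completed_events": 1,
        "sleep_interval": 1,
    }
    for name, flags in options.items():
        print(f"{name}: runtime-changeable={bool(flags & FLAG_RUNTIME)}")
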
"exclude_perf_counters", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Do not include perf-counters in the metrics output", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.146 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 
2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.147 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "rbd_stats_pools": { 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "name": "rbd_stats_pools", 
2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "rbd_stats_pools_refresh_interval": { 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "name": "rbd_stats_pools_refresh_interval", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "300", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.148 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "scrape_interval": { 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "name": "scrape_interval", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "type": "float", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "15.0", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "server_addr": { 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "name": "server_addr", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 
2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "::", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "the IPv4 or IPv6 address on which the module listens for HTTP requests", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "server_port": { 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "name": "server_port", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "9283", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "the port on which the module listens for HTTP requests", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "stale_cache_strategy": { 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "name": "stale_cache_strategy", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "type": 
"str", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "log", 2026-03-31T20:25:18.149 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "standby_behaviour": { 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "name": "standby_behaviour", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "default", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "default", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "error" 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "standby_error_status_code": { 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "name": "standby_error_status_code", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "500", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "min": "400", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "max": "599", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.150 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.150 
2026-03-31T20:25:18.151 INFO:tasks.workunit.client.0.vm03.stdout:    { "name": "rbd_support", "can_run": true, "error_string": "", "module_options": {
2026-03-31T20:25:18.151 INFO:tasks.workunit.client.0.vm03.stdout:        "log_level": {"name": "log_level", "type": "str", "level": "advanced", "flags": 1, "default_value": "", "min": "", "max": "", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.151 INFO:tasks.workunit.client.0.vm03.stdout:        "log_to_cluster": {"name": "log_to_cluster", "type": "bool", "level": "advanced", "flags": 1, "default_value": "False", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.152 INFO:tasks.workunit.client.0.vm03.stdout:        "log_to_cluster_level": {"name": "log_to_cluster_level", "type": "str", "level": "advanced", "flags": 1, "default_value": "info", "min": "", "max": "", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.152 INFO:tasks.workunit.client.0.vm03.stdout:        "log_to_file": {"name": "log_to_file", "type": "bool", "level": "advanced", "flags": 1, "default_value": "False", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.152 INFO:tasks.workunit.client.0.vm03.stdout:        "max_concurrent_snap_create": {"name": "max_concurrent_snap_create", "type": "int", "level": "advanced", "flags": 0, "default_value": "10", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.152 INFO:tasks.workunit.client.0.vm03.stdout:        "mirror_snapshot_schedule": {"name": "mirror_snapshot_schedule", "type": "str", "level": "advanced", "flags": 0, "default_value": "", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.153 INFO:tasks.workunit.client.0.vm03.stdout:        "sqlite3_killpoint": {"name": "sqlite3_killpoint", "type": "int", "level": "dev", "flags": 1, "default_value": "0", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.153 INFO:tasks.workunit.client.0.vm03.stdout:        "trash_purge_schedule": {"name": "trash_purge_schedule", "type": "str", "level": "advanced", "flags": 0, "default_value": "", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []}
2026-03-31T20:25:18.153 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-31T20:25:18.153 INFO:tasks.workunit.client.0.vm03.stdout:    },
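
The mirror_snapshot_schedule and trash_purge_schedule options above back the `rbd mirror snapshot schedule` and `rbd trash purge schedule` command families. A sketch that drives them through the CLI (assumes a reachable cluster, a hypothetical "images" pool, and snapshot-based mirroring already configured):

    import subprocess

    # Both schedules are persisted by the rbd_support mgr module.
    subprocess.run(["rbd", "mirror", "snapshot", "schedule", "add",
                    "--pool", "images", "1h"], check=True)
    subprocess.run(["rbd", "trash", "purge", "schedule", "ls",
                    "--pool", "images"], check=True)
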
2026-03-31T20:25:18.153 INFO:tasks.workunit.client.0.vm03.stdout:    { "name": "rgw", "can_run": true, "error_string": "", "module_options": {
2026-03-31T20:25:18.153 INFO:tasks.workunit.client.0.vm03.stdout:        "log_level": {"name": "log_level", "type": "str", "level": "advanced", "flags": 1, "default_value": "", "min": "", "max": "", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.154 INFO:tasks.workunit.client.0.vm03.stdout:        "log_to_cluster": {"name": "log_to_cluster", "type": "bool", "level": "advanced", "flags": 1, "default_value": "False", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.154 INFO:tasks.workunit.client.0.vm03.stdout:        "log_to_cluster_level": {"name": "log_to_cluster_level", "type": "str", "level": "advanced", "flags": 1, "default_value": "info", "min": "", "max": "", "enum_allowed": ["", "critical", "debug", "error", "info", "warning"], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.154 INFO:tasks.workunit.client.0.vm03.stdout:        "log_to_file": {"name": "log_to_file", "type": "bool", "level": "advanced", "flags": 1, "default_value": "False", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.154 INFO:tasks.workunit.client.0.vm03.stdout:        "secondary_zone_period_retry_limit": {"name": "secondary_zone_period_retry_limit", "type": "int", "level": "advanced", "flags": 0, "default_value": "5", "min": "", "max": "", "enum_allowed": [], "desc": "RGW module period update retry limit for secondary site", "long_desc": "", "tags": [], "see_also": []},
2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout:        "sqlite3_killpoint": {"name": "sqlite3_killpoint", "type": "int", "level": "dev", "flags": 1, "default_value": "0", "min": "", "max": "", "enum_allowed": [], "desc": "", "long_desc": "", "tags": [], "see_also": []}
2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout:    },
"tags": [], 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.155 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.156 
INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                "roption1": {
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "roption1",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                "roption2": {
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "roption2",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "xyz",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                "rwoption1": {
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "rwoption1",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.156 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                "rwoption2": {
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "rwoption2",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                "rwoption3": {
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "rwoption3",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "float",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                "rwoption4": {
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "rwoption4",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                "rwoption5": {
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "rwoption5",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.157 INFO:tasks.workunit.client.0.vm03.stdout:                "rwoption6": {
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "rwoption6",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "True",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                "rwoption7": {
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "rwoption7",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "1",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "42",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                "sqlite3_killpoint": {
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "sqlite3_killpoint",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "dev",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "0",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                "testkey": {
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "testkey",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                "testlkey": {
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "testlkey",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.158 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                "testnewline": {
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "testnewline",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                }
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:            }
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:        {
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:            "name": "snap_schedule",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:            "can_run": true,
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:            "error_string": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:            "module_options": {
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                "allow_m_granularity": {
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "allow_m_granularity",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "allow minute scheduled snapshots",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                "dump_on_update": {
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "dump_on_update",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "dump database to debug log on update",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                "log_level": {
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_level",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.159 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster": {
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster_level": {
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster_level",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "info",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_file": {
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_file",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                "sqlite3_killpoint": {
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "sqlite3_killpoint",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "dev",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "0",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.160 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                }
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:            }
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:        {
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:            "name": "stats",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:            "can_run": true,
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:            "error_string": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:            "module_options": {
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                "log_level": {
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_level",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster": {
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster_level": {
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster_level",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "info",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.161 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_file": {
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_file",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                "sqlite3_killpoint": {
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "sqlite3_killpoint",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "dev",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "0",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                }
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:            }
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:        {
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:            "name": "status",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:            "can_run": true,
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:            "error_string": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:            "module_options": {
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                "log_level": {
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_level",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster": {
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.162 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster_level": {
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster_level",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "info",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_file": {
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_file",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                "sqlite3_killpoint": {
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "sqlite3_killpoint",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "dev",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "0",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                }
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:            }
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:        {
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:            "name": "telegraf",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:            "can_run": true,
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:            "error_string": "",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:            "module_options": {
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                "address": {
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "address",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.163 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "unixgram:///tmp/telegraf.sock",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                "interval": {
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "interval",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "secs",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "15",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                "log_level": {
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_level",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster": {
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster_level": {
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster_level",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "info",
2026-03-31T20:25:18.164 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_file": {
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_file",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                "sqlite3_killpoint": {
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "sqlite3_killpoint",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "dev",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "0",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                }
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:            }
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:        {
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:            "name": "telemetry",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:            "can_run": true,
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:            "error_string": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:            "module_options": {
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                "channel_basic": {
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "channel_basic",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "True",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "Share basic cluster information (size, version)",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                "channel_crash": {
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "channel_crash",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.165 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "True",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "Share metadata about Ceph daemon crashes (version, stack straces, etc)",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                "channel_device": {
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "channel_device",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "True",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                "channel_ident": {
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "channel_ident",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "Share a user-provided description and/or contact email for the cluster",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                "channel_perf": {
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "channel_perf",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "Share various performance metrics of a cluster",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                "contact": {
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "contact",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.166 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                "description": {
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "description",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                "device_url": {
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "device_url",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "https://telemetry.ceph.com/device",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                "enabled": {
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "enabled",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                "interval": {
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "interval",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "24",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "8",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                "last_opt_revision": {
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "last_opt_revision",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "1",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.167 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                "leaderboard": {
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "leaderboard",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                "leaderboard_description": {
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "leaderboard_description",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                "log_level": {
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_level",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster": {
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.168 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_cluster_level": {
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_cluster_level",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "info",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                        "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                        "critical",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                        "debug",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                        "error",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                        "info",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                        "warning"
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    ],
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                "log_to_file": {
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "log_to_file",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "bool",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "False",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                "organization": {
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "organization",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                "proxy": {
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "proxy",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.169 INFO:tasks.workunit.client.0.vm03.stdout:                "sqlite3_killpoint": {
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "sqlite3_killpoint",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "int",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "dev",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 1,
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value": "0",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "min": "",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "max": "",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "enum_allowed": [],
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "desc": "",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "long_desc": "",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "tags": [],
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "see_also": []
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                "url": {
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "name": "url",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "str",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "level": "advanced",
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "flags": 0,
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout:                    "default_value":
"https://telemetry.ceph.com/report", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "name": "test_orchestrator", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 
2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.170 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 
2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "name": "volumes", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "can_run": true, 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "error_string": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "module_options": { 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "log_level": { 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_level", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "", 2026-03-31T20:25:18.171 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster": { 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.172 
INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_cluster_level": { 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_cluster_level", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "type": "str", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "info", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [ 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "critical", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "debug", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "error", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "info", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "warning" 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "log_to_file": { 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "name": "log_to_file", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.172 
INFO:tasks.workunit.client.0.vm03.stdout: "max_concurrent_clones": { 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "name": "max_concurrent_clones", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.172 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "4", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Number of asynchronous cloner threads", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "pause_cloning": { 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "name": "pause_cloning", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Pause asynchronous cloner threads", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "pause_purging": { 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "name": "pause_purging", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Pause asynchronous subvolume purge threads", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.173 
INFO:tasks.workunit.client.0.vm03.stdout: "periodic_async_work": { 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "name": "periodic_async_work", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "False", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Periodically check for async work", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "snapshot_clone_delay": { 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snapshot_clone_delay", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Delay clone begin operation by snapshot_clone_delay seconds", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "snapshot_clone_no_wait": { 2026-03-31T20:25:18.173 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snapshot_clone_no_wait", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "type": "bool", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "level": "advanced", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 0, 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "True", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "Reject subvolume clone request when cloner threads are busy", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.174 
INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "sqlite3_killpoint": { 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "name": "sqlite3_killpoint", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "type": "int", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "level": "dev", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "flags": 1, 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "default_value": "0", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "min": "", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "max": "", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "enum_allowed": [], 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "desc": "", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "long_desc": "", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "tags": [], 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "see_also": [] 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "services": {}, 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "always_on_modules": { 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "octopus": [ 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "balancer", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "crash", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "devicehealth", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "orchestrator", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "pg_autoscaler", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "progress", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "rbd_support", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "status", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "telemetry", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "volumes" 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "pacific": [ 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "balancer", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "crash", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "devicehealth", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "orchestrator", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "pg_autoscaler", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "progress", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "rbd_support", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "status", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "telemetry", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "volumes" 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "quincy": [ 2026-03-31T20:25:18.174 
INFO:tasks.workunit.client.0.vm03.stdout: "balancer", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "crash", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "devicehealth", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "orchestrator", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "pg_autoscaler", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "progress", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "rbd_support", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "status", 2026-03-31T20:25:18.174 INFO:tasks.workunit.client.0.vm03.stdout: "telemetry", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "volumes" 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "reef": [ 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "balancer", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "crash", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "devicehealth", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "orchestrator", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "pg_autoscaler", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "progress", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "rbd_support", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "status", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "telemetry", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "volumes" 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "squid": [ 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "balancer", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "crash", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "devicehealth", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "orchestrator", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "pg_autoscaler", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "progress", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "rbd_support", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "status", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "telemetry", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "volumes" 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle": [ 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "balancer", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "crash", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "devicehealth", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "orchestrator", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "pg_autoscaler", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "progress", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "rbd_support", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "status", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "telemetry", 2026-03-31T20:25:18.175 
INFO:tasks.workunit.client.0.vm03.stdout: "volumes" 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "force_disabled_modules": {}, 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "last_failure_osd_epoch": 0, 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "active_clients": [ 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "name": "devicehealth", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 4057350736 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "name": "libcephsqlite", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 3884125092 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "name": "rbd_support", 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.175 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1536173869 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: "name": "volumes", 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 3367780566 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:18.176 INFO:tasks.workunit.client.0.vm03.stdout: } 
2026-03-31T20:25:18.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:782: test_mon_misc: ceph mgr module ls
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:MODULE
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:balancer on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:crash on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:devicehealth on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:orchestrator on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:pg_autoscaler on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:progress on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:rbd_support on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:status on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:telemetry on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:volumes on (always on)
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:iostat on
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:nfs on
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:alerts -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:cephadm -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:dashboard -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:diskprediction_local -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:influx -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:insights -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:localpool -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:mirroring -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:osd_perf_query -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:osd_support -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:prometheus -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:rgw -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:selftest -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:snap_schedule -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:stats -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:telegraf -
2026-03-31T20:25:18.603 INFO:tasks.workunit.client.0.vm03.stdout:test_orchestrator -
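[editor's note] The :783 step that follows asserts failure: enabling a module that does not exist must return nonzero. The xtrace lines at test.sh:35-36 below are consistent with a helper of roughly this shape (a hypothetical reconstruction for readability, not the verbatim test.sh source):

    # Hypothetical reconstruction of expect_false as traced below:
    # run the given command and invert its exit status.
    expect_false() {
        set -x
        if "$@"; then return 1; else return 0; fi
    }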
2026-03-31T20:25:18.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:783: test_mon_misc: expect_false ceph mgr module enable foodne
2026-03-31T20:25:18.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:25:18.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph mgr module enable foodne
2026-03-31T20:25:18.792 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: all mgr daemons do not support module 'foodne', pass --force to force enablement
2026-03-31T20:25:18.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:25:18.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:784: test_mon_misc: ceph mgr module enable foodne --force
2026-03-31T20:25:20.885 INFO:tasks.ceph.mgr.x.vm03.stderr:/usr/lib/python3/dist-packages/scipy/__init__.py:67: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
2026-03-31T20:25:20.885 INFO:tasks.ceph.mgr.x.vm03.stderr:Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
2026-03-31T20:25:20.885 INFO:tasks.ceph.mgr.x.vm03.stderr: from numpy import show_config as show_numpy_config
2026-03-31T20:25:20.984 INFO:tasks.workunit.client.0.vm03.stderr:module 'foodne' is already enabled
2026-03-31T20:25:20.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:785: test_mon_misc: ceph mgr module disable foodne
2026-03-31T20:25:21.441 INFO:tasks.ceph.mgr.x.vm03.stderr:Failed to import NVMeoFClient and related components: cannot import name 'NVMeoFClient' from 'dashboard.services.nvmeof_client' (/usr/share/ceph/mgr/dashboard/services/nvmeof_client.py)
2026-03-31T20:25:22.705 INFO:tasks.ceph.mgr.x.vm03.stderr:/usr/lib/python3/dist-packages/scipy/__init__.py:67: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
2026-03-31T20:25:22.705 INFO:tasks.ceph.mgr.x.vm03.stderr:Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
2026-03-31T20:25:22.705 INFO:tasks.ceph.mgr.x.vm03.stderr: from numpy import show_config as show_numpy_config
2026-03-31T20:25:22.816 INFO:tasks.workunit.client.0.vm03.stderr:module 'foodne' is already disabled
2026-03-31T20:25:22.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:786: test_mon_misc: ceph mgr module disable foodnebizbangbash
2026-03-31T20:25:23.241 INFO:tasks.ceph.mgr.x.vm03.stderr:Failed to import NVMeoFClient and related components: cannot import name 'NVMeoFClient' from 'dashboard.services.nvmeof_client' (/usr/share/ceph/mgr/dashboard/services/nvmeof_client.py)
2026-03-31T20:25:24.623 INFO:tasks.workunit.client.0.vm03.stderr:module 'foodnebizbangbash' is already disabled
2026-03-31T20:25:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:788: test_mon_misc: ceph mon metadata a
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": "[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0]",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "arch": "x86_64",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_release": "tentacle",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_short": "20.2.0-721-g5bb32787",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_when_created": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "compression_algorithms": "none, snappy, zlib, zstd, lz4",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "cpu": "AMD Ryzen 9 7950X3D 16-Core Processor",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "created_at": "2026-03-31T20:21:23.189796Z",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "device_ids": "",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "device_paths": "vda=/dev/disk/by-path/pci-0000:00:05.0",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "devices": "vda",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "distro": "ubuntu",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "distro_description": "Ubuntu 22.04.5 LTS",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "distro_version": "22.04",
2026-03-31T20:25:24.912 INFO:tasks.workunit.client.0.vm03.stdout: "hostname": "vm03",
2026-03-31T20:25:24.913 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_description": "#181-Ubuntu SMP Fri Feb 6 22:44:50 UTC 2026",
2026-03-31T20:25:24.913 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_version": "5.15.0-171-generic",
2026-03-31T20:25:24.913 INFO:tasks.workunit.client.0.vm03.stdout: "mem_swap_kb": "0",
2026-03-31T20:25:24.913 INFO:tasks.workunit.client.0.vm03.stdout: "mem_total_kb": "10179900",
2026-03-31T20:25:24.913 INFO:tasks.workunit.client.0.vm03.stdout: "os": "Linux"
2026-03-31T20:25:24.913 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:25:24.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:789: test_mon_misc: ceph mon metadata
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout:[
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": "[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0]",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "arch": "x86_64",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_release": "tentacle",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_short": "20.2.0-721-g5bb32787",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_when_created": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "compression_algorithms": "none, snappy, zlib, zstd, lz4",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "cpu": "AMD Ryzen 9 7950X3D 16-Core Processor",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "created_at": "2026-03-31T20:21:23.189796Z",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "device_ids": "",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "device_paths": "vda=/dev/disk/by-path/pci-0000:00:05.0",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "devices": "vda",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "distro": "ubuntu",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "distro_description": "Ubuntu 22.04.5 LTS",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "distro_version": "22.04",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "hostname": "vm03",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_description": "#181-Ubuntu SMP Fri Feb 6 22:44:50 UTC 2026",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_version": "5.15.0-171-generic",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "mem_swap_kb": "0",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "mem_total_kb": "10179900",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "os": "Linux"
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": "[v2:192.168.123.103:3301/0,v1:192.168.123.103:6790/0]",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "arch": "x86_64",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_release": "tentacle",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_short": "20.2.0-721-g5bb32787",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_when_created": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "compression_algorithms": "none, snappy, zlib, zstd, lz4",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "cpu": "AMD Ryzen 9 7950X3D 16-Core Processor",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "created_at": "2026-03-31T20:21:23.369952Z",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "device_ids": "",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "device_paths": "vda=/dev/disk/by-path/pci-0000:00:05.0",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "devices": "vda",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "distro": "ubuntu",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "distro_description": "Ubuntu 22.04.5 LTS",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "distro_version": "22.04",
2026-03-31T20:25:25.196 INFO:tasks.workunit.client.0.vm03.stdout: "hostname": "vm03",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_description": "#181-Ubuntu SMP Fri Feb 6 22:44:50 UTC 2026",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_version": "5.15.0-171-generic",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "mem_swap_kb": "0",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "mem_total_kb": "10179900",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "os": "Linux"
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "name": "c",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": "[v2:192.168.123.103:3302/0,v1:192.168.123.103:6791/0]",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "arch": "x86_64",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_release": "tentacle",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_short": "20.2.0-721-g5bb32787",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_when_created": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "compression_algorithms": "none, snappy, zlib, zstd, lz4",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "cpu": "AMD Ryzen 9 7950X3D 16-Core Processor",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "created_at": "2026-03-31T20:21:23.540831Z",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "device_ids": "",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "device_paths": "vda=/dev/disk/by-path/pci-0000:00:05.0",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "devices": "vda",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "distro": "ubuntu",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "distro_description": "Ubuntu 22.04.5 LTS",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "distro_version": "22.04",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "hostname": "vm03",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_description": "#181-Ubuntu SMP Fri Feb 6 22:44:50 UTC 2026",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_version": "5.15.0-171-generic",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "mem_swap_kb": "0",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "mem_total_kb": "10179900",
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: "os": "Linux"
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-31T20:25:25.197 INFO:tasks.workunit.client.0.vm03.stdout:]
2026-03-31T20:25:25.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:790: test_mon_misc: ceph mon count-metadata ceph_version
2026-03-31T20:25:25.481 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-31T20:25:25.481 INFO:tasks.workunit.client.0.vm03.stdout: "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)": 3
2026-03-31T20:25:25.481 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:25:25.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:791: test_mon_misc: ceph mon versions
2026-03-31T20:25:25.770 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-31T20:25:25.770 INFO:tasks.workunit.client.0.vm03.stdout: "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)": 3
2026-03-31T20:25:25.770 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:25:26.016 INFO:tasks.workunit.client.0.vm03.stdout: "mem_total_kb": "10179900", 2026-03-31T20:25:26.016 INFO:tasks.workunit.client.0.vm03.stdout: "os": "Linux" 2026-03-31T20:25:26.016 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:26.016 INFO:tasks.workunit.client.0.vm03.stdout:] 2026-03-31T20:25:26.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:794: test_mon_misc: ceph mgr versions 2026-03-31T20:25:26.262 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:26.262 INFO:tasks.workunit.client.0.vm03.stdout: "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)": 1 2026-03-31T20:25:26.262 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:26.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:795: test_mon_misc: ceph mgr count-metadata ceph_version 2026-03-31T20:25:26.509 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:26.509 INFO:tasks.workunit.client.0.vm03.stdout: "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)": 1 2026-03-31T20:25:26.509 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:26.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:797: test_mon_misc: ceph versions 2026-03-31T20:25:26.774 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: "mon": { 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)": 3 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: "mgr": { 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)": 1 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: "osd": { 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)": 3 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: "overall": { 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)": 7 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:26.775 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:26.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:799: test_mon_misc: ceph node ls 2026-03-31T20:25:27.055 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:27.055 INFO:tasks.workunit.client.0.vm03.stdout: "mon": { 2026-03-31T20:25:27.055 INFO:tasks.workunit.client.0.vm03.stdout: "vm03": [ 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: "a", 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: "b", 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: "c" 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:27.056 
INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: "osd": { 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: "vm03": [ 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: "mgr": { 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: "vm03": [ 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: "x" 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:27.056 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:27.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:25:27.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_mon 2026-03-31T20:25:27.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1235: test_mon_mon: ceph --help mon 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: General usage: 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: ============== 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout:usage: ceph [-h] [-c CEPHCONF] [-i INPUT_FILE] [-o OUTPUT_FILE] 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: [--setuser SETUSER] [--setgroup SETGROUP] [--id CLIENT_ID] 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: [--name CLIENT_NAME] [--cluster CLUSTER] 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: [--admin-daemon ADMIN_SOCKET] [-s] [-w] [--watch-debug] 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: [--watch-info] [--watch-sec] [--watch-warn] [--watch-error] 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: [-W WATCH_CHANNEL] [--version] [--verbose] [--concise] 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: [--daemon-output-file DAEMON_OUTPUT_FILE] 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: [-f {json,json-pretty,xml,xml-pretty,plain,yaml}] 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: [--connect-timeout CLUSTER_TIMEOUT] [--block] [--period PERIOD] 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout:Ceph administration tool 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout:options: 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: -h, --help request mon help 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: -c CEPHCONF, --conf CEPHCONF 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: ceph configuration file 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: -i INPUT_FILE, --in-file INPUT_FILE 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: input file, or "-" for stdin 2026-03-31T20:25:27.397 
INFO:tasks.workunit.client.0.vm03.stdout: -o OUTPUT_FILE, --out-file OUTPUT_FILE 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: output file, or "-" for stdout 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: --setuser SETUSER set user file permission 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: --setgroup SETGROUP set group file permission 2026-03-31T20:25:27.397 INFO:tasks.workunit.client.0.vm03.stdout: --id CLIENT_ID, --user CLIENT_ID 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: client id for authentication 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --name CLIENT_NAME, -n CLIENT_NAME 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: client name for authentication 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --cluster CLUSTER cluster name 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --admin-daemon ADMIN_SOCKET 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: submit admin-socket command (e.g. "help" for a list of 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: available commands) 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: -s, --status show cluster status 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: -w, --watch watch live cluster changes 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --watch-debug watch debug events 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --watch-info watch info events 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --watch-sec watch security events 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --watch-warn watch warn events 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --watch-error watch error events 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: -W WATCH_CHANNEL, --watch-channel WATCH_CHANNEL 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: watch live cluster changes on a specific channel 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: (e.g., cluster, audit, cephadm, or '*' for all) 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --version, -v display version 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --verbose make verbose 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --concise make less verbose 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --daemon-output-file DAEMON_OUTPUT_FILE 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: output file location local to the daemon for JSON 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: produced by tell commands 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: -f {json,json-pretty,xml,xml-pretty,plain,yaml}, --format {json,json-pretty,xml,xml-pretty,plain,yaml} 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: Note: yaml is only valid for orch commands 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --connect-timeout CLUSTER_TIMEOUT 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: set a timeout for connecting to the cluster 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --block block until completion (scrub and deep-scrub only) 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: --period PERIOD, -p PERIOD 2026-03-31T20:25:27.398
INFO:tasks.workunit.client.0.vm03.stdout: polling period, default 1.0 second (for polling 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: commands only) 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: Local commands: 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: =============== 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout:ping <mon.id> Send simple presence/life test to a mon 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout: <mon.id> may be 'mon.*' for all mons 2026-03-31T20:25:27.398 INFO:tasks.workunit.client.0.vm03.stdout:daemon {type.id|path} 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: Same as --admin-daemon, but auto-find admin socket 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:daemonperf {type.id | path} [stat-pats] [priority] [<interval>] [<count>] 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:daemonperf {type.id | path} list|ls [stat-pats] [priority] 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: Get selected perf stats from daemon/admin socket 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: Optional shell-glob comma-delim match string stat-pats 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: Optional selection priority (can abbreviate name): 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: critical, interesting, useful, noninteresting, debug 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: List shows a table of all available stats 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: Run <count> times (default forever), 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: once per <interval> seconds (default 1) 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: Monitor commands: 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: ================= 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon add <name> <addr> [<location>...] 
add new monitor named <name> at <addr>, 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: possibly with CRUSH location 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon add disallowed_leader <name> prevent the named mon from being a 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: leader 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon count-metadata <property> count mons by metadata field property 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon disable_stretch_mode [<crush_rule>] [--yes-i-really-mean-it] disable stretch mode, reverting to normal peering rules 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon dump [<epoch:int>] dump formatted monmap (optionally from 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: epoch) 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon enable-msgr2 enable the msgr2 protocol on port 3300 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon enable_stretch_mode <tiebreaker_mon> <new_crush_rule> <dividing_bucket> enable stretch mode, changing the peering rules and failure handling on all pools with <tiebreaker_mon> as the 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: tiebreaker and setting <dividing_bucket> locations as the units for 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: stretching across 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon feature ls [--with-value] list available mon map features to be 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: set/unset 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon feature set <feature_name> [--yes-i-really-mean-it] set provided feature on mon map 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon getmap [<epoch:int>] get monmap 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon metadata [<id>] fetch metadata for mon <id> 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon ok-to-add-offline check whether adding a mon and not 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: starting it would break quorum 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon ok-to-rm <id> check whether removing the specified 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout: mon would break quorum 2026-03-31T20:25:27.399 INFO:tasks.workunit.client.0.vm03.stdout:mon ok-to-stop <ids>... 
check whether mon(s) can be safely 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout: stopped without reducing immediate 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout: availability 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon rm <name> remove monitor named <name> 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon rm disallowed_leader <name> allow the named mon to be a leader again 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon scrub scrub the monitor stores 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon set election_strategy <strategy> set the election strategy to use; 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout: choices classic, disallow, connectivity 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon set-addrs <name> <addrs> set the addrs (IPs and ports) a 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout: specific monitor binds to 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon set-rank <name> <rank:int> set the rank for the specified mon 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon set-weight <name> <weight:int> set the weight for the specified mon 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon set_location <name> <args>... specify location <args> for the monitor 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout: <name>, using CRUSH bucket names 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon set_new_tiebreaker <mon_id> [--yes-i-really-mean-it] switch the stretch tiebreaker to be the 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout: named mon 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon stat summarize monitor status 2026-03-31T20:25:27.400 INFO:tasks.workunit.client.0.vm03.stdout:mon versions check running versions of monitors 2026-03-31T20:25:27.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1237: test_mon_mon: ceph osd dump -h 2026-03-31T20:25:27.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1237: test_mon_mon: grep 'osd dump' 2026-03-31T20:25:27.526 INFO:tasks.workunit.client.0.vm03.stdout:osd dump [<epoch:int>] print summary of OSD map 2026-03-31T20:25:27.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1238: test_mon_mon: ceph osd dump 123 -h 2026-03-31T20:25:27.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1238: test_mon_mon: grep 'osd dump' 2026-03-31T20:25:27.682 INFO:tasks.workunit.client.0.vm03.stdout:osd dump [<epoch:int>] print summary of OSD map 2026-03-31T20:25:27.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1240: test_mon_mon: ceph mon dump 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stdout:epoch 1 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stdout:fsid a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stdout:last_changed 2026-03-31T20:21:18.374590+0000 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stdout:created 2026-03-31T20:21:18.374590+0000 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stdout:min_mon_release 20 (tentacle) 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stdout:election_strategy: 3 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stdout:0: 
[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0] mon.a 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stdout:1: [v2:192.168.123.103:3301/0,v1:192.168.123.103:6790/0] mon.b 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stdout:2: [v2:192.168.123.103:3302/0,v1:192.168.123.103:6791/0] mon.c 2026-03-31T20:25:27.952 INFO:tasks.workunit.client.0.vm03.stderr:dumped monmap epoch 1 2026-03-31T20:25:27.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1241: test_mon_mon: ceph mon getmap -o /tmp/cephtool.sYl/monmap.26274 2026-03-31T20:25:28.232 INFO:tasks.workunit.client.0.vm03.stderr:got monmap epoch 1 2026-03-31T20:25:28.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1242: test_mon_mon: '[' -s /tmp/cephtool.sYl/monmap.26274 ']' 2026-03-31T20:25:28.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1245: test_mon_mon: ceph mon dump -f json 2026-03-31T20:25:28.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1245: test_mon_mon: jq -r '.mons[0].name' 2026-03-31T20:25:28.509 INFO:tasks.workunit.client.0.vm03.stderr:dumped monmap epoch 1 2026-03-31T20:25:28.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1245: test_mon_mon: first=a 2026-03-31T20:25:28.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1246: test_mon_mon: ceph tell mon.a mon_status 2026-03-31T20:25:28.591 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "state": "leader", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "election_epoch": 4, 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "uptime": 244844, 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "quorum": [ 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_age": 244, 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "required_con": "2451647607914708996", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "required_mon": [ 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 
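Lines 1245-1246 of test.sh, traced above, first pull the rank-0 mon's name out of the monmap and then interrogate that mon directly; the mon_status reply being printed here is its answer. Replayed by hand, with the same ceph and jq invocations the test uses (a sketch, assuming a reachable cluster):

    # find the rank-0 mon, then ask it for its own status;
    # state should come back as "leader" or "peon"
    first=$(ceph mon dump -f json 2>/dev/null | jq -r '.mons[0].name')
    ceph tell "mon.$first" mon_status -f json | jq '{name, rank, state, quorum}'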
2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_con": "4541880224203014143", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_mon": [ 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:25:28.592 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "outside_quorum": [], 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "extra_probe_peers": [], 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "sync_provider": [], 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "monmap": { 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 1, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "fsid": "a4a0ca01-ae82-443e-a7c7-50605716689a", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "modified": "2026-03-31T20:21:18.374590Z", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "created": "2026-03-31T20:21:18.374590Z", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release": 20, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release_name": "tentacle", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "election_strategy": 3, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "disallowed_leaders": "", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "tiebreaker_mon": "", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "removed_ranks": "", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "persistent": [ 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: 
"octopus", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "optional": [] 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "mons": [ 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789/0", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6789/0", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 1, 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:25:28.593 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:25:28.594 
INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790/0", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6790/0", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 2, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "name": "c", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791/0", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6791/0", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "feature_map": { 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "mon": [ 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "osd": [ 
2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "client": [ 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "num": 4 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false 2026-03-31T20:25:28.594 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:28.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1249: test_mon_mon: ceph mon feature ls 2026-03-31T20:25:28.888 INFO:tasks.workunit.client.0.vm03.stdout:all features 2026-03-31T20:25:28.888 INFO:tasks.workunit.client.0.vm03.stdout: supported: [kraken,luminous,mimic,osdmap-prune,nautilus,octopus,pacific,elector-pinging,quincy,reef,squid,tentacle] 2026-03-31T20:25:28.888 INFO:tasks.workunit.client.0.vm03.stdout: persistent: [kraken,luminous,mimic,osdmap-prune,nautilus,octopus,pacific,elector-pinging,quincy,reef,squid,tentacle] 2026-03-31T20:25:28.888 INFO:tasks.workunit.client.0.vm03.stdout:on current monmap (epoch 1) 2026-03-31T20:25:28.888 INFO:tasks.workunit.client.0.vm03.stdout: persistent: [kraken,luminous,mimic,osdmap-prune,nautilus,octopus,pacific,elector-pinging,quincy,reef,squid,tentacle] 2026-03-31T20:25:28.888 INFO:tasks.workunit.client.0.vm03.stdout: required: [kraken,luminous,mimic,osdmap-prune,nautilus,octopus,pacific,elector-pinging,quincy,reef,squid,tentacle] 2026-03-31T20:25:28.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1250: test_mon_mon: ceph mon feature set kraken --yes-i-really-mean-it 2026-03-31T20:25:29.164 INFO:tasks.workunit.client.0.vm03.stderr:setting feature '[kraken]' feature '[kraken]' already set on monmap 2026-03-31T20:25:29.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1251: test_mon_mon: expect_false ceph mon feature set abcd 2026-03-31T20:25:29.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:25:29.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph mon feature set abcd 2026-03-31T20:25:29.359 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: unknown feature 'abcd' 2026-03-31T20:25:29.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:25:29.363 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1252: test_mon_mon: expect_false ceph mon feature set abcd --yes-i-really-mean-it 2026-03-31T20:25:29.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:25:29.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph mon feature set abcd --yes-i-really-mean-it 2026-03-31T20:25:29.548 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: unknown feature 'abcd' 2026-03-31T20:25:29.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:25:29.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1255: test_mon_mon: expect_failure /tmp/cephtool.sYl ceph mon add disallowed_leader a 2026-03-31T20:25:29.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2021: expect_failure: local dir=/tmp/cephtool.sYl 2026-03-31T20:25:29.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2022: expect_failure: shift 2026-03-31T20:25:29.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2023: expect_failure: local expected=ceph 2026-03-31T20:25:29.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2024: expect_failure: shift 2026-03-31T20:25:29.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2025: expect_failure: local success 2026-03-31T20:25:29.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2027: expect_failure: mon add disallowed_leader a 2026-03-31T20:25:29.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2030: expect_failure: success=false 2026-03-31T20:25:29.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2033: expect_failure: false 2026-03-31T20:25:29.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2033: expect_failure: grep --quiet ceph /tmp/cephtool.sYl/out 2026-03-31T20:25:29.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2037: expect_failure: return 0 2026-03-31T20:25:29.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1256: test_mon_mon: ceph mon set election_strategy disallow 2026-03-31T20:25:29.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1257: test_mon_mon: ceph mon add disallowed_leader a 2026-03-31T20:25:35.062 INFO:tasks.workunit.client.0.vm03.stderr:mon.a is already disallowed 2026-03-31T20:25:35.076 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1258: test_mon_mon: ceph mon set election_strategy connectivity 2026-03-31T20:25:35.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1259: test_mon_mon: ceph mon rm disallowed_leader a 2026-03-31T20:25:41.573 INFO:tasks.workunit.client.0.vm03.stderr:mon.a is already allowed 2026-03-31T20:25:41.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1260: test_mon_mon: ceph mon set election_strategy classic 2026-03-31T20:25:46.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1261: test_mon_mon: expect_failure /tmp/cephtool.sYl ceph mon rm disallowed_leader a 2026-03-31T20:25:46.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2021: expect_failure: local dir=/tmp/cephtool.sYl 2026-03-31T20:25:46.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2022: expect_failure: shift 2026-03-31T20:25:46.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2023: expect_failure: local expected=ceph 2026-03-31T20:25:46.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2024: expect_failure: shift 2026-03-31T20:25:46.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2025: expect_failure: local success 2026-03-31T20:25:46.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2027: expect_failure: mon rm disallowed_leader a 2026-03-31T20:25:46.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2030: expect_failure: success=false 2026-03-31T20:25:46.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2033: expect_failure: false 2026-03-31T20:25:46.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2033: expect_failure: grep --quiet ceph /tmp/cephtool.sYl/out 2026-03-31T20:25:46.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2037: expect_failure: return 0 2026-03-31T20:25:46.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1265: test_mon_mon: ceph mon stat 2026-03-31T20:25:47.088 INFO:tasks.workunit.client.0.vm03.stdout:e6: 3 mons at {a=[v2:192.168.123.103:3300/0,v1:192.168.123.103:6789/0],b=[v2:192.168.123.103:3301/0,v1:192.168.123.103:6790/0],c=[v2:192.168.123.103:3302/0,v1:192.168.123.103:6791/0]} removed_ranks: {} disallowed_leaders: {}, election epoch 30, leader 0 a, quorum 0,1,2 a,b,c 2026-03-31T20:25:47.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1266: test_mon_mon: ceph mon stat -f json 2026-03-31T20:25:47.102 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1266: test_mon_mon: jq . 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 6, 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release_name": "tentacle", 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: "num_mons": 3, 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: "leader": "a", 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: "quorum": [ 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a" 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 1, 2026-03-31T20:25:47.395 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b" 2026-03-31T20:25:47.396 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:25:47.396 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:25:47.396 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 2, 2026-03-31T20:25:47.396 INFO:tasks.workunit.client.0.vm03.stdout: "name": "c" 2026-03-31T20:25:47.396 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:25:47.396 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:25:47.396 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:25:47.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:25:47.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_osd 2026-03-31T20:25:47.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1490: test_mon_osd: bl=192.168.0.1:0/1000 2026-03-31T20:25:47.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1491: test_mon_osd: ceph osd blocklist add 192.168.0.1:0/1000 2026-03-31T20:25:48.862 INFO:tasks.workunit.client.0.vm03.stderr:blocklisting 192.168.0.1:0/1000 until 2026-03-31T21:25:47.907585+0000 (3600 sec) 2026-03-31T20:25:48.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1492: test_mon_osd: ceph osd blocklist ls 2026-03-31T20:25:48.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1492: test_mon_osd: grep 192.168.0.1:0/1000 2026-03-31T20:25:49.094 INFO:tasks.workunit.client.0.vm03.stderr:listed 6 entries 2026-03-31T20:25:49.107 INFO:tasks.workunit.client.0.vm03.stdout:192.168.0.1:0/1000 2026-03-31T21:25:47.777772+0000 2026-03-31T20:25:49.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1493: test_mon_osd: ceph osd blocklist ls --format=json-pretty 2026-03-31T20:25:49.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1493: test_mon_osd: sed 's/\\\//\//' 2026-03-31T20:25:49.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1493: test_mon_osd: grep 192.168.0.1:0/1000 
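The blocklist round trip being driven here is add, list-and-grep, then remove. The same verification can lean on the JSON output instead of grep; a sketch that assumes `blocklist ls -f json` yields a flat array of entries with an `addr` field, which is what the json-pretty grep below suggests:

    addr=192.168.0.1:0/1000
    ceph osd blocklist add "$addr"
    # -e makes jq's exit status reflect whether the address is present
    ceph osd blocklist ls -f json | jq -e --arg a "$addr" 'any(.[]; .addr == $a)'
    ceph osd blocklist rm "$addr"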
2026-03-31T20:25:49.327 INFO:tasks.workunit.client.0.vm03.stderr:listed 6 entries 2026-03-31T20:25:49.340 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.0.1:0/1000", 2026-03-31T20:25:49.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1494: test_mon_osd: ceph osd dump --format=json-pretty 2026-03-31T20:25:49.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1494: test_mon_osd: grep 192.168.0.1:0/1000 2026-03-31T20:25:49.560 INFO:tasks.workunit.client.0.vm03.stdout: "192.168.0.1:0/1000": "2026-03-31T21:25:47.777772+0000", 2026-03-31T20:25:49.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1495: test_mon_osd: ceph osd dump 2026-03-31T20:25:49.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1495: test_mon_osd: grep 192.168.0.1:0/1000 2026-03-31T20:25:49.777 INFO:tasks.workunit.client.0.vm03.stdout:blocklist 192.168.0.1:0/1000 expires 2026-03-31T21:25:47.777772+0000 2026-03-31T20:25:49.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1496: test_mon_osd: ceph osd blocklist rm 192.168.0.1:0/1000 2026-03-31T20:25:50.934 INFO:tasks.workunit.client.0.vm03.stderr:192.168.0.1:0/1000 isn't blocklisted 2026-03-31T20:25:50.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1497: test_mon_osd: ceph osd blocklist ls 2026-03-31T20:25:50.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1497: test_mon_osd: expect_false grep 192.168.0.1:0/1000 2026-03-31T20:25:50.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:25:50.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 192.168.0.1:0/1000 2026-03-31T20:25:51.175 INFO:tasks.workunit.client.0.vm03.stderr:listed 5 entries 2026-03-31T20:25:51.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:25:51.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1499: test_mon_osd: bl=192.168.0.1 2026-03-31T20:25:51.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1501: test_mon_osd: ceph osd blocklist add 192.168.0.1 2026-03-31T20:25:52.892 INFO:tasks.workunit.client.0.vm03.stderr:blocklisting 192.168.0.1:0/0 until 2026-03-31T21:25:51.939843+0000 (3600 sec) 2026-03-31T20:25:52.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1502: test_mon_osd: ceph osd blocklist ls 2026-03-31T20:25:52.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1502: test_mon_osd: grep 192.168.0.1 2026-03-31T20:25:53.127 INFO:tasks.workunit.client.0.vm03.stderr:listed 6 entries 2026-03-31T20:25:53.139 INFO:tasks.workunit.client.0.vm03.stdout:192.168.0.1:0/0 2026-03-31T21:25:51.347721+0000 2026-03-31T20:25:53.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1503: 
test_mon_osd: ceph osd blocklist rm 192.168.0.1 2026-03-31T20:25:53.959 INFO:tasks.workunit.client.0.vm03.stderr:192.168.0.1:0/0 isn't blocklisted 2026-03-31T20:25:53.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1504: test_mon_osd: ceph osd blocklist ls 2026-03-31T20:25:53.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1504: test_mon_osd: expect_false grep 192.168.0.1 2026-03-31T20:25:53.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:25:53.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 192.168.0.1 2026-03-31T20:25:54.190 INFO:tasks.workunit.client.0.vm03.stderr:listed 5 entries 2026-03-31T20:25:54.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:25:54.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1505: test_mon_osd: expect_false 'ceph osd blocklist add 192.168.0.1/-1' 2026-03-31T20:25:54.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:25:54.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: 'ceph osd blocklist add 192.168.0.1/-1' 2026-03-31T20:25:54.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh: line 36: ceph osd blocklist add 192.168.0.1/-1: No such file or directory 2026-03-31T20:25:54.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:25:54.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1506: test_mon_osd: expect_false 'ceph osd blocklist add 192.168.0.1/foo' 2026-03-31T20:25:54.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:25:54.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: 'ceph osd blocklist add 192.168.0.1/foo' 2026-03-31T20:25:54.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh: line 36: ceph osd blocklist add 192.168.0.1/foo: No such file or directory 2026-03-31T20:25:54.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:25:54.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1509: test_mon_osd: expect_false 'ceph osd blocklist add 1234.56.78.90/100' 2026-03-31T20:25:54.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:25:54.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: 'ceph osd blocklist add 1234.56.78.90/100' 2026-03-31T20:25:54.205 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh: line 36: ceph osd blocklist add 1234.56.78.90/100: No such file or directory 2026-03-31T20:25:54.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:25:54.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1512: test_mon_osd: bl=192.168.0.1:0/24 2026-03-31T20:25:54.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1513: test_mon_osd: ceph osd blocklist range add 192.168.0.1:0/24 2026-03-31T20:25:55.919 INFO:tasks.workunit.client.0.vm03.stderr:blocklisting cidr:192.168.0.1:0/24 until 2026-03-31T21:25:54.961341+0000 (3600 sec) 2026-03-31T20:25:55.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1514: test_mon_osd: ceph osd blocklist ls 2026-03-31T20:25:55.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1514: test_mon_osd: grep 192.168.0.1:0/24 2026-03-31T20:25:56.144 INFO:tasks.workunit.client.0.vm03.stderr:listed 6 entries 2026-03-31T20:25:56.157 INFO:tasks.workunit.client.0.vm03.stdout:cidr:192.168.0.1:0/24 2026-03-31T21:25:54.361196+0000 2026-03-31T20:25:56.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1515: test_mon_osd: ceph osd blocklist range rm 192.168.0.1:0/24 2026-03-31T20:25:56.978 INFO:tasks.workunit.client.0.vm03.stderr:cidr:192.168.0.1:0/24 isn't blocklisted 2026-03-31T20:25:56.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1516: test_mon_osd: ceph osd blocklist ls 2026-03-31T20:25:56.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1516: test_mon_osd: expect_false grep 192.168.0.1:0/24 2026-03-31T20:25:56.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:25:56.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 192.168.0.1:0/24 2026-03-31T20:25:57.203 INFO:tasks.workunit.client.0.vm03.stderr:listed 5 entries 2026-03-31T20:25:57.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:25:57.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1517: test_mon_osd: bad_bl=192.168.0.1/33 2026-03-31T20:25:57.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1518: test_mon_osd: expect_false ceph osd blocklist range add 192.168.0.1/33 2026-03-31T20:25:57.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:25:57.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd blocklist range add 192.168.0.1/33 2026-03-31T20:25:57.383 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: Too many bits in range for that protocol! 
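The EINVAL above is the expected rejection: /33 is more bits than an IPv4 mask allows. The steps that follow turn on a parsing subtlety worth spelling out: plain `blocklist add` reads `192.168.0.1:0/24` as an address whose nonce is 24 (one entry), while `blocklist range add` reads the identical string as a /24 CIDR covering the whole subnet, which is why the earlier entry was listed as `cidr:192.168.0.1:0/24`. Side by side:

    ceph osd blocklist add 192.168.0.1:0/24        # /24 parsed as a nonce -> single entry
    ceph osd blocklist range add 192.168.0.1:0/24  # /24 parsed as a CIDR mask -> whole subnet
    ceph osd blocklist clear                       # removes all entries at once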
2026-03-31T20:25:57.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:25:57.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1521: test_mon_osd: ceph osd blocklist add 192.168.0.1:0/24 2026-03-31T20:25:58.936 INFO:tasks.workunit.client.0.vm03.stderr:blocklisting 192.168.0.1:0/24 until 2026-03-31T21:25:57.996250+0000 (3600 sec) 2026-03-31T20:25:58.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1522: test_mon_osd: ceph osd blocklist ls 2026-03-31T20:25:58.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1522: test_mon_osd: grep 192.168.0.1:0/24 2026-03-31T20:25:59.158 INFO:tasks.workunit.client.0.vm03.stderr:listed 6 entries 2026-03-31T20:25:59.172 INFO:tasks.workunit.client.0.vm03.stdout:192.168.0.1:0/24 2026-03-31T21:25:57.548818+0000 2026-03-31T20:25:59.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1523: test_mon_osd: ceph osd blocklist clear 2026-03-31T20:26:00.952 INFO:tasks.workunit.client.0.vm03.stderr: removed all blocklist entries 2026-03-31T20:26:00.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1524: test_mon_osd: ceph osd blocklist ls 2026-03-31T20:26:00.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1524: test_mon_osd: expect_false grep 192.168.0.1:0/24 2026-03-31T20:26:00.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:26:00.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 192.168.0.1:0/24 2026-03-31T20:26:01.186 INFO:tasks.workunit.client.0.vm03.stderr:listed 0 entries 2026-03-31T20:26:01.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:26:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1527: test_mon_osd: ceph osd blacklist ls 2026-03-31T20:26:01.411 INFO:tasks.workunit.client.0.vm03.stderr:listed 0 entries 2026-03-31T20:26:01.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1532: test_mon_osd: ceph osd crush reweight-all 2026-03-31T20:26:02.969 INFO:tasks.workunit.client.0.vm03.stderr:reweighted crush hierarchy 2026-03-31T20:26:02.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1533: test_mon_osd: ceph osd crush tunables legacy 2026-03-31T20:26:04.981 INFO:tasks.workunit.client.0.vm03.stderr:adjusted tunables profile to legacy 2026-03-31T20:26:04.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1534: test_mon_osd: ceph osd crush show-tunables 2026-03-31T20:26:04.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1534: test_mon_osd: grep argonaut 2026-03-31T20:26:05.222 INFO:tasks.workunit.client.0.vm03.stdout: "profile": "argonaut", 2026-03-31T20:26:05.222 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1535: test_mon_osd: ceph osd crush tunables bobtail 2026-03-31T20:26:07.003 INFO:tasks.workunit.client.0.vm03.stderr:adjusted tunables profile to bobtail 2026-03-31T20:26:07.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1536: test_mon_osd: ceph osd crush show-tunables 2026-03-31T20:26:07.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1536: test_mon_osd: grep bobtail 2026-03-31T20:26:07.242 INFO:tasks.workunit.client.0.vm03.stdout: "profile": "bobtail", 2026-03-31T20:26:07.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1537: test_mon_osd: ceph osd crush tunables firefly 2026-03-31T20:26:09.018 INFO:tasks.workunit.client.0.vm03.stderr:adjusted tunables profile to firefly 2026-03-31T20:26:09.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1538: test_mon_osd: ceph osd crush show-tunables 2026-03-31T20:26:09.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1538: test_mon_osd: grep firefly 2026-03-31T20:26:09.251 INFO:tasks.workunit.client.0.vm03.stdout: "profile": "firefly", 2026-03-31T20:26:09.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1540: test_mon_osd: ceph osd crush set-tunable straw_calc_version 0 2026-03-31T20:26:11.033 INFO:tasks.workunit.client.0.vm03.stderr:adjusted tunable straw_calc_version to 0 2026-03-31T20:26:11.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1541: test_mon_osd: ceph osd crush get-tunable straw_calc_version 2026-03-31T20:26:11.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1541: test_mon_osd: grep 0 2026-03-31T20:26:11.257 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-31T20:26:11.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1542: test_mon_osd: ceph osd crush set-tunable straw_calc_version 1 2026-03-31T20:26:13.043 INFO:tasks.workunit.client.0.vm03.stderr:adjusted tunable straw_calc_version to 1 2026-03-31T20:26:13.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1543: test_mon_osd: ceph osd crush get-tunable straw_calc_version 2026-03-31T20:26:13.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1543: test_mon_osd: grep 1 2026-03-31T20:26:13.270 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-31T20:26:13.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1547: test_mon_osd: expect_false ceph osd set-require-min-compat-client dumpling 2026-03-31T20:26:13.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:26:13.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd set-require-min-compat-client dumpling 2026-03-31T20:26:13.419 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: osdmap current utilizes features 
that require hammer; cannot set require_min_compat_client below that to dumpling 2026-03-31T20:26:13.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:26:13.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1548: test_mon_osd: ceph osd get-require-min-compat-client 2026-03-31T20:26:13.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1548: test_mon_osd: grep luminous 2026-03-31T20:26:13.656 INFO:tasks.workunit.client.0.vm03.stdout:luminous 2026-03-31T20:26:13.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1549: test_mon_osd: ceph osd dump 2026-03-31T20:26:13.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1549: test_mon_osd: grep 'require_min_compat_client luminous' 2026-03-31T20:26:13.893 INFO:tasks.workunit.client.0.vm03.stdout:require_min_compat_client luminous 2026-03-31T20:26:13.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1556: test_mon_osd: ceph osd scrub 0 --block 2026-03-31T20:26:14.350 INFO:tasks.workunit.client.0.vm03.stderr:instructed osd(s) 0 to scrub 2026-03-31T20:26:21.005 INFO:tasks.workunit.client.0.vm03.stdout:Waiting for scrub to complete... 2026-03-31T20:26:21.005 INFO:tasks.workunit.client.0.vm03.stdout:scrub completed 2026-03-31T20:26:21.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1557: test_mon_osd: ceph osd deep-scrub 0 --block 2026-03-31T20:26:21.413 INFO:tasks.workunit.client.0.vm03.stderr:instructed osd(s) 0 to deep-scrub 2026-03-31T20:26:28.042 INFO:tasks.workunit.client.0.vm03.stdout:Waiting for deep-scrub to complete... 
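[editor's note] Three behaviours verified in this stretch. First, ceph osd crush tunables <profile> swaps whole tunables profiles (the legacy profile reports itself as "argonaut" in show-tunables), and set-tunable/get-tunable round-trip a single knob such as straw_calc_version. Second, require_min_compat_client can only be raised, never lowered below what the osdmap's feature set already demands — hence the EPERM on dumpling while the value stays at luminous. Third, scrub and deep-scrub accept --block, which makes the command wait for completion instead of returning once the instruction is queued. Condensed from the commands in the log:

    ceph osd crush tunables firefly
    ceph osd crush show-tunables | grep firefly
    ceph osd crush set-tunable straw_calc_version 1
    ceph osd crush get-tunable straw_calc_version    # -> 1
    ceph osd get-require-min-compat-client           # -> luminous
    ceph osd set-require-min-compat-client dumpling  # -> Error EPERM
    ceph osd scrub 0 --block                         # returns after "scrub completed"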
2026-03-31T20:26:28.042 INFO:tasks.workunit.client.0.vm03.stdout:deep-scrub completed 2026-03-31T20:26:28.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1560: test_mon_osd: ceph osd scrub 0 2026-03-31T20:26:28.245 INFO:tasks.workunit.client.0.vm03.stderr:instructed osd(s) 0 to scrub 2026-03-31T20:26:28.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1561: test_mon_osd: ceph osd deep-scrub 0 2026-03-31T20:26:28.446 INFO:tasks.workunit.client.0.vm03.stderr:instructed osd(s) 0 to deep-scrub 2026-03-31T20:26:28.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1562: test_mon_osd: ceph osd repair 0 2026-03-31T20:26:28.650 INFO:tasks.workunit.client.0.vm03.stderr:instructed osd(s) 0 to repair 2026-03-31T20:26:28.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1565: test_mon_osd: rados lspools 2026-03-31T20:26:28.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1565: test_mon_osd: pool_names='.mgr 2026-03-31T20:26:28.684 INFO:tasks.workunit.client.0.vm03.stderr:rbd' 2026-03-31T20:26:28.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1566: test_mon_osd: for pool_name in $pool_names 2026-03-31T20:26:28.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1568: test_mon_osd: ceph osd pool scrub .mgr 2026-03-31T20:26:28.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1569: test_mon_osd: ceph osd pool deep-scrub .mgr 2026-03-31T20:26:29.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1570: test_mon_osd: ceph osd pool repair .mgr 2026-03-31T20:26:29.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1571: test_mon_osd: ceph osd pool force-recovery .mgr 2026-03-31T20:26:29.489 INFO:tasks.workunit.client.0.vm03.stderr:pg 1.0 doesn't require recovery; 2026-03-31T20:26:29.489 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:26:29.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1572: test_mon_osd: ceph osd pool cancel-force-recovery .mgr 2026-03-31T20:26:29.690 INFO:tasks.workunit.client.0.vm03.stderr:pg 1.0 recovery not forced; 2026-03-31T20:26:29.690 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:26:29.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1573: test_mon_osd: ceph osd pool force-backfill .mgr 2026-03-31T20:26:29.903 INFO:tasks.workunit.client.0.vm03.stderr:pg 1.0 doesn't require backfilling; 2026-03-31T20:26:29.903 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:26:29.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1574: test_mon_osd: ceph osd pool cancel-force-backfill .mgr 2026-03-31T20:26:30.105 INFO:tasks.workunit.client.0.vm03.stderr:pg 1.0 backfill not forced; 2026-03-31T20:26:30.105 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:26:30.117 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1566: test_mon_osd: for pool_name in $pool_names 2026-03-31T20:26:30.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1568: test_mon_osd: ceph osd pool scrub rbd 2026-03-31T20:26:30.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1569: test_mon_osd: ceph osd pool deep-scrub rbd 2026-03-31T20:26:30.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1570: test_mon_osd: ceph osd pool repair rbd 2026-03-31T20:26:30.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1571: test_mon_osd: ceph osd pool force-recovery rbd 2026-03-31T20:26:30.919 INFO:tasks.workunit.client.0.vm03.stderr:pg 2.0 doesn't require recovery; pg 2.1 doesn't require recovery; pg 2.2 doesn't require recovery; pg 2.3 doesn't require recovery; pg 2.4 doesn't require recovery; pg 2.5 doesn't require recovery; pg 2.6 doesn't require recovery; pg 2.7 doesn't require recovery; 2026-03-31T20:26:30.919 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:26:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1572: test_mon_osd: ceph osd pool cancel-force-recovery rbd 2026-03-31T20:26:31.125 INFO:tasks.workunit.client.0.vm03.stderr:pg 2.0 recovery not forced; pg 2.1 recovery not forced; pg 2.2 recovery not forced; pg 2.3 recovery not forced; pg 2.4 recovery not forced; pg 2.5 recovery not forced; pg 2.6 recovery not forced; pg 2.7 recovery not forced; 2026-03-31T20:26:31.125 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:26:31.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1573: test_mon_osd: ceph osd pool force-backfill rbd 2026-03-31T20:26:31.329 INFO:tasks.workunit.client.0.vm03.stderr:pg 2.0 doesn't require backfilling; pg 2.1 doesn't require backfilling; pg 2.2 doesn't require backfilling; pg 2.3 doesn't require backfilling; pg 2.4 doesn't require backfilling; pg 2.5 doesn't require backfilling; pg 2.6 doesn't require backfilling; pg 2.7 doesn't require backfilling; 2026-03-31T20:26:31.329 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:26:31.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1574: test_mon_osd: ceph osd pool cancel-force-backfill rbd 2026-03-31T20:26:31.532 INFO:tasks.workunit.client.0.vm03.stderr:pg 2.0 backfill not forced; pg 2.1 backfill not forced; pg 2.2 backfill not forced; pg 2.3 backfill not forced; pg 2.4 backfill not forced; pg 2.5 backfill not forced; pg 2.6 backfill not forced; pg 2.7 backfill not forced; 2026-03-31T20:26:31.532 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:26:31.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:26:31.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set noup 2026-03-31T20:26:32.746 INFO:tasks.workunit.client.0.vm03.stderr:noup is set 2026-03-31T20:26:32.759 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset noup 2026-03-31T20:26:34.760 INFO:tasks.workunit.client.0.vm03.stderr:noup is unset 2026-03-31T20:26:34.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:26:34.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set nodown 2026-03-31T20:26:36.773 INFO:tasks.workunit.client.0.vm03.stderr:nodown is set 2026-03-31T20:26:36.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset nodown 2026-03-31T20:26:38.784 INFO:tasks.workunit.client.0.vm03.stderr:nodown is unset 2026-03-31T20:26:38.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:26:38.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set noin 2026-03-31T20:26:40.797 INFO:tasks.workunit.client.0.vm03.stderr:noin is set 2026-03-31T20:26:40.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset noin 2026-03-31T20:26:42.808 INFO:tasks.workunit.client.0.vm03.stderr:noin is unset 2026-03-31T20:26:42.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:26:42.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set noout 2026-03-31T20:26:44.824 INFO:tasks.workunit.client.0.vm03.stderr:noout is set 2026-03-31T20:26:44.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset noout 2026-03-31T20:26:46.840 INFO:tasks.workunit.client.0.vm03.stderr:noout is unset 2026-03-31T20:26:46.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:26:46.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set noscrub 2026-03-31T20:26:48.849 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-31T20:26:48.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset noscrub 2026-03-31T20:26:50.863 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is unset 2026-03-31T20:26:50.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 
2026-03-31T20:26:50.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set nodeep-scrub 2026-03-31T20:26:52.878 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-31T20:26:52.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset nodeep-scrub 2026-03-31T20:26:54.894 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is unset 2026-03-31T20:26:54.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:26:54.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set nobackfill 2026-03-31T20:26:56.905 INFO:tasks.workunit.client.0.vm03.stderr:nobackfill is set 2026-03-31T20:26:56.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset nobackfill 2026-03-31T20:26:58.919 INFO:tasks.workunit.client.0.vm03.stderr:nobackfill is unset 2026-03-31T20:26:58.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:26:58.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set norebalance 2026-03-31T20:27:00.935 INFO:tasks.workunit.client.0.vm03.stderr:norebalance is set 2026-03-31T20:27:00.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset norebalance 2026-03-31T20:27:02.955 INFO:tasks.workunit.client.0.vm03.stderr:norebalance is unset 2026-03-31T20:27:02.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:27:02.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set norecover 2026-03-31T20:27:04.975 INFO:tasks.workunit.client.0.vm03.stderr:norecover is set 2026-03-31T20:27:04.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset norecover 2026-03-31T20:27:06.986 INFO:tasks.workunit.client.0.vm03.stderr:norecover is unset 2026-03-31T20:27:06.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:27:06.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set notieragent 2026-03-31T20:27:09.000 INFO:tasks.workunit.client.0.vm03.stderr:notieragent is set 2026-03-31T20:27:09.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: 
test_mon_osd: ceph osd unset notieragent 2026-03-31T20:27:11.014 INFO:tasks.workunit.client.0.vm03.stderr:notieragent is unset 2026-03-31T20:27:11.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1577: test_mon_osd: for f in noup nodown noin noout noscrub nodeep-scrub nobackfill norebalance norecover notieragent noautoscale 2026-03-31T20:27:11.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1580: test_mon_osd: ceph osd set noautoscale 2026-03-31T20:27:13.030 INFO:tasks.workunit.client.0.vm03.stderr:noautoscale is set 2026-03-31T20:27:13.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1581: test_mon_osd: ceph osd unset noautoscale 2026-03-31T20:27:15.046 INFO:tasks.workunit.client.0.vm03.stderr:noautoscale is unset 2026-03-31T20:27:15.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1583: test_mon_osd: expect_false ceph osd set bogus 2026-03-31T20:27:15.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:15.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd set bogus 2026-03-31T20:27:15.204 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: bogus not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|pglog_hardlimit|noautoscale 2026-03-31T20:27:15.204 INFO:tasks.workunit.client.0.vm03.stderr:osd set [--yes-i-really-mean-it] : set 2026-03-31T20:27:15.204 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:15.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:15.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1584: test_mon_osd: expect_false ceph osd unset bogus 2026-03-31T20:27:15.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:15.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd unset bogus 2026-03-31T20:27:15.358 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: bogus not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|noautoscale 2026-03-31T20:27:15.358 INFO:tasks.workunit.client.0.vm03.stderr:osd unset : unset 2026-03-31T20:27:15.358 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:15.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:15.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1585: test_mon_osd: for f in sortbitwise recover_deletes require_jewel_osds require_kraken_osds 2026-03-31T20:27:15.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1588: test_mon_osd: expect_false ceph osd set sortbitwise 2026-03-31T20:27:15.361 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:15.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd set sortbitwise 2026-03-31T20:27:15.512 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: sortbitwise not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|pglog_hardlimit|noautoscale 2026-03-31T20:27:15.512 INFO:tasks.workunit.client.0.vm03.stderr:osd set [--yes-i-really-mean-it] : set 2026-03-31T20:27:15.512 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:15.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:15.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1589: test_mon_osd: expect_false ceph osd unset sortbitwise 2026-03-31T20:27:15.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:15.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd unset sortbitwise 2026-03-31T20:27:15.666 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: sortbitwise not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|noautoscale 2026-03-31T20:27:15.666 INFO:tasks.workunit.client.0.vm03.stderr:osd unset : unset 2026-03-31T20:27:15.666 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:15.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:15.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1585: test_mon_osd: for f in sortbitwise recover_deletes require_jewel_osds require_kraken_osds 2026-03-31T20:27:15.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1588: test_mon_osd: expect_false ceph osd set recover_deletes 2026-03-31T20:27:15.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:15.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd set recover_deletes 2026-03-31T20:27:15.820 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: recover_deletes not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|pglog_hardlimit|noautoscale 2026-03-31T20:27:15.820 INFO:tasks.workunit.client.0.vm03.stderr:osd set [--yes-i-really-mean-it] : set 2026-03-31T20:27:15.820 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:15.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:15.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1589: test_mon_osd: expect_false ceph osd unset recover_deletes 
2026-03-31T20:27:15.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:15.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd unset recover_deletes 2026-03-31T20:27:15.977 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: recover_deletes not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|noautoscale 2026-03-31T20:27:15.977 INFO:tasks.workunit.client.0.vm03.stderr:osd unset : unset 2026-03-31T20:27:15.977 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:15.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:15.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1585: test_mon_osd: for f in sortbitwise recover_deletes require_jewel_osds require_kraken_osds 2026-03-31T20:27:15.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1588: test_mon_osd: expect_false ceph osd set require_jewel_osds 2026-03-31T20:27:15.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:15.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd set require_jewel_osds 2026-03-31T20:27:16.131 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: require_jewel_osds not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|pglog_hardlimit|noautoscale 2026-03-31T20:27:16.131 INFO:tasks.workunit.client.0.vm03.stderr:osd set [--yes-i-really-mean-it] : set 2026-03-31T20:27:16.131 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:16.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:16.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1589: test_mon_osd: expect_false ceph osd unset require_jewel_osds 2026-03-31T20:27:16.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:16.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd unset require_jewel_osds 2026-03-31T20:27:16.284 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: require_jewel_osds not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|noautoscale 2026-03-31T20:27:16.284 INFO:tasks.workunit.client.0.vm03.stderr:osd unset : unset 2026-03-31T20:27:16.284 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:16.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:16.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1585: test_mon_osd: for f in sortbitwise 
recover_deletes require_jewel_osds require_kraken_osds 2026-03-31T20:27:16.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1588: test_mon_osd: expect_false ceph osd set require_kraken_osds 2026-03-31T20:27:16.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:16.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd set require_kraken_osds 2026-03-31T20:27:16.434 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: require_kraken_osds not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|pglog_hardlimit|noautoscale 2026-03-31T20:27:16.434 INFO:tasks.workunit.client.0.vm03.stderr:osd set [--yes-i-really-mean-it] : set 2026-03-31T20:27:16.434 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:16.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:16.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1589: test_mon_osd: expect_false ceph osd unset require_kraken_osds 2026-03-31T20:27:16.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:16.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd unset require_kraken_osds 2026-03-31T20:27:16.584 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: require_kraken_osds not in full|pause|noup|nodown|noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|noautoscale 2026-03-31T20:27:16.584 INFO:tasks.workunit.client.0.vm03.stderr:osd unset : unset 2026-03-31T20:27:16.584 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:16.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:16.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1591: test_mon_osd: ceph osd require-osd-release tentacle 2026-03-31T20:27:16.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1593: test_mon_osd: expect_false ceph osd require-osd-release squid 2026-03-31T20:27:16.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:16.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd require-osd-release squid 2026-03-31T20:27:16.945 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: require_osd_release cannot be lowered once it has been set 2026-03-31T20:27:16.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:16.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1594: test_mon_osd: expect_false ceph osd require-osd-release reef 
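[editor's note] The two loops above cover the full set of toggleable cluster flags and the retired ones. Every flag from noup through noautoscale sets and unsets cleanly; sortbitwise, recover_deletes, require_jewel_osds and require_kraken_osds are no longer valid arguments to osd set/unset and fail with EINVAL. require-osd-release is likewise monotonic: once raised to tentacle it cannot be lowered to squid or reef (EPERM). A condensed sketch of what test.sh runs here:

    for f in noup nodown noin noout noscrub nodeep-scrub \
             nobackfill norebalance norecover notieragent noautoscale; do
        ceph osd set "$f"      # prints "<flag> is set"
        ceph osd unset "$f"    # prints "<flag> is unset"
    done
    ceph osd require-osd-release tentacle
    ceph osd require-osd-release squid   # Error EPERM: cannot be lowered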
2026-03-31T20:27:16.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:16.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd require-osd-release reef 2026-03-31T20:27:17.097 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: require_osd_release cannot be lowered once it has been set 2026-03-31T20:27:17.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:17.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1597: test_mon_osd: ceph osd set noup 2026-03-31T20:27:19.073 INFO:tasks.workunit.client.0.vm03.stderr:noup is set 2026-03-31T20:27:19.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1598: test_mon_osd: ceph osd down 0 2026-03-31T20:27:19.235 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:19.230+0000 7f8708384640 -1 mon.a@0(leader).osd e218 definitely_dead 0 2026-03-31T20:27:20.133 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:20.130+0000 7f8708384640 -1 mon.a@0(leader).osd e219 definitely_dead 0 2026-03-31T20:27:20.134 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 is already down. 2026-03-31T20:27:20.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1599: test_mon_osd: ceph osd dump 2026-03-31T20:27:20.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1599: test_mon_osd: grep 'osd.0 down' 2026-03-31T20:27:20.352 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 down in weight 1 up_from 8 up_thru 132 down_at 219 last_clean_interval [0,0) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6805/950776786 exists f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:20.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1600: test_mon_osd: ceph osd unset noup 2026-03-31T20:27:20.423 INFO:tasks.ceph.osd.0.vm03.stderr:2026-03-31T20:27:20.418+0000 7fe3b39d9640 -1 osd.0 219 osdmap NOUP flag is set, waiting for it to clear 2026-03-31T20:27:21.088 INFO:tasks.ceph.osd.0.vm03.stderr:2026-03-31T20:27:21.082+0000 7fe3ae792640 -1 osd.0 220 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-31T20:27:22.086 INFO:tasks.workunit.client.0.vm03.stderr:noup is unset 2026-03-31T20:27:22.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1601: test_mon_osd: max_run=1000 2026-03-31T20:27:22.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1602: test_mon_osd: (( i=0 )) 2026-03-31T20:27:22.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1602: test_mon_osd: (( i < 1000 )) 2026-03-31T20:27:22.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1603: test_mon_osd: ceph osd dump 2026-03-31T20:27:22.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1603: test_mon_osd: grep 'osd.0 up' 2026-03-31T20:27:22.308 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up 
in weight 1 up_from 221 up_thru 132 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:22.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1607: test_mon_osd: break 2026-03-31T20:27:22.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1610: test_mon_osd: ceph osd dump 2026-03-31T20:27:22.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1610: test_mon_osd: grep 'osd.0 up' 2026-03-31T20:27:22.513 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 221 up_thru 132 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:22.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1612: test_mon_osd: ceph osd dump 2026-03-31T20:27:22.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1612: test_mon_osd: grep 'osd.0 up' 2026-03-31T20:27:22.717 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 221 up_thru 132 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:22.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1614: test_mon_osd: ceph osd find 1 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6800", 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 3578452477 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "ea32c3bc-b054-4d75-ba06-731b384d7a6a", 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "host": "vm03", 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": { 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "host": "vm03", 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: "root": "default" 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:27:22.909 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:27:22.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1615: test_mon_osd: ceph osd find osd.1 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:27:23.120 
INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6800", 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 3578452477 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "ea32c3bc-b054-4d75-ba06-731b384d7a6a", 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "host": "vm03", 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": { 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "host": "vm03", 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: "root": "default" 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:27:23.120 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:27:23.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1616: test_mon_osd: expect_false ceph osd find osd.xyz 2026-03-31T20:27:23.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:23.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd find osd.xyz 2026-03-31T20:27:23.277 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: osd id xyz not integer 2026-03-31T20:27:23.277 INFO:tasks.workunit.client.0.vm03.stderr:osd find : find osd in the CRUSH map and show its location 2026-03-31T20:27:23.277 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1617: test_mon_osd: expect_false ceph osd find xyz 2026-03-31T20:27:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd find xyz 2026-03-31T20:27:23.426 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: osd id xyz not integer 2026-03-31T20:27:23.426 INFO:tasks.workunit.client.0.vm03.stderr:osd find : find osd in the CRUSH map and show its location 2026-03-31T20:27:23.426 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:23.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:23.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1618: test_mon_osd: expect_false ceph osd find 0.1 2026-03-31T20:27:23.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:23.429 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd find 0.1 2026-03-31T20:27:23.576 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: unknown type 0 2026-03-31T20:27:23.576 INFO:tasks.workunit.client.0.vm03.stderr:osd find : find osd in the CRUSH map and show its location 2026-03-31T20:27:23.576 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:23.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:23.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1619: test_mon_osd: ceph --format plain osd find 1 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6800", 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 3578452477 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "ea32c3bc-b054-4d75-ba06-731b384d7a6a", 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "host": "vm03", 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": { 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "host": "vm03", 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: "root": "default" 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:27:23.775 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:27:23.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1620: test_mon_osd: uname 2026-03-31T20:27:23.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1620: test_mon_osd: '[' Linux == Linux ']' 2026-03-31T20:27:23.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1621: test_mon_osd: ceph osd metadata 1 2026-03-31T20:27:23.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1621: test_mon_osd: grep distro 2026-03-31T20:27:23.990 INFO:tasks.workunit.client.0.vm03.stdout: "distro": "ubuntu", 2026-03-31T20:27:23.990 INFO:tasks.workunit.client.0.vm03.stdout: "distro_description": "Ubuntu 22.04.5 LTS", 2026-03-31T20:27:23.990 INFO:tasks.workunit.client.0.vm03.stdout: "distro_version": "22.04", 2026-03-31T20:27:23.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1622: test_mon_osd: ceph --format plain osd metadata 1 2026-03-31T20:27:23.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1622: test_mon_osd: grep distro 
2026-03-31T20:27:24.196 INFO:tasks.workunit.client.0.vm03.stdout: "distro": "ubuntu", 2026-03-31T20:27:24.196 INFO:tasks.workunit.client.0.vm03.stdout: "distro_description": "Ubuntu 22.04.5 LTS", 2026-03-31T20:27:24.196 INFO:tasks.workunit.client.0.vm03.stdout: "distro_version": "22.04", 2026-03-31T20:27:24.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1624: test_mon_osd: ceph osd out 0 2026-03-31T20:27:24.343 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:24.338+0000 7f8708384640 -1 mon.a@0(leader).osd e222 definitely_dead 0 2026-03-31T20:27:25.156 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:25.150+0000 7f8708384640 -1 mon.a@0(leader).osd e223 definitely_dead 0 2026-03-31T20:27:25.157 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 is already out. 2026-03-31T20:27:25.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1625: test_mon_osd: grep 'osd.0.*out' 2026-03-31T20:27:25.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1625: test_mon_osd: ceph osd dump 2026-03-31T20:27:25.370 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up out weight 0 up_from 221 up_thru 221 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:25.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1626: test_mon_osd: ceph osd in 0 2026-03-31T20:27:25.517 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:25.514+0000 7f8708384640 -1 mon.a@0(leader).osd e223 definitely_dead 0 2026-03-31T20:27:26.165 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:26.162+0000 7f8708384640 -1 mon.a@0(leader).osd e224 definitely_dead 0 2026-03-31T20:27:26.166 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 is already in. 
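[editor's note] This stretch covers OSD lookup and membership. With noup set, osd down 0 keeps the daemon down until the flag is cleared, after which the test polls ceph osd dump for "osd.0 up". ceph osd find accepts either a bare id or the osd.N form, resolving it to address, fsid, host and CRUSH location, and rejects non-integer ids; ceph osd metadata returns per-daemon facts such as the distro. One quirk visible in the output above: --format plain on osd find and osd metadata still emits the JSON dump. As exercised in the log:

    ceph osd set noup && ceph osd down 0   # osd.0 stays down while noup is set
    ceph osd unset noup                    # then poll: ceph osd dump | grep 'osd.0 up'
    ceph osd find 1                        # same result as: ceph osd find osd.1
    ceph osd metadata 1 | grep distro
    ceph osd out 0 && ceph osd in 0        # both idempotent ("already out/in")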
2026-03-31T20:27:26.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1627: test_mon_osd: grep 'osd.0.*in' 2026-03-31T20:27:26.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1627: test_mon_osd: ceph osd dump 2026-03-31T20:27:26.383 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 221 up_thru 221 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:26.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1628: test_mon_osd: ceph osd find 0 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6804", 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 950776786 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "f3bff22c-a21c-470a-aec1-c158b49523b8", 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "host": "vm03", 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": { 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "host": "vm03", 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: "root": "default" 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:27:26.575 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:27:26.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1630: test_mon_osd: ceph osd info 0 2026-03-31T20:27:26.777 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 221 up_thru 221 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:26.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1631: test_mon_osd: ceph osd info osd.0 2026-03-31T20:27:26.980 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 221 up_thru 221 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:26.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1632: test_mon_osd: expect_false ceph osd info osd.xyz 2026-03-31T20:27:26.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:26.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: 
ceph osd info osd.xyz 2026-03-31T20:27:27.146 INFO:tasks.workunit.client.0.vm03.stderr:osd.xyz not valid: osd id xyz not integer 2026-03-31T20:27:27.146 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: unused arguments: ['osd.xyz'] 2026-03-31T20:27:27.146 INFO:tasks.workunit.client.0.vm03.stderr:osd info [] : print osd's {id} information (instead of all osds from map) 2026-03-31T20:27:27.146 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:27.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:27.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1633: test_mon_osd: expect_false ceph osd info xyz 2026-03-31T20:27:27.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:27.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd info xyz 2026-03-31T20:27:27.294 INFO:tasks.workunit.client.0.vm03.stderr:xyz not valid: osd id xyz not integer 2026-03-31T20:27:27.294 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: unused arguments: ['xyz'] 2026-03-31T20:27:27.294 INFO:tasks.workunit.client.0.vm03.stderr:osd info [] : print osd's {id} information (instead of all osds from map) 2026-03-31T20:27:27.294 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:27:27.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:27.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1634: test_mon_osd: expect_false ceph osd info 42 2026-03-31T20:27:27.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:27.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd info 42 2026-03-31T20:27:27.441 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: osd.42 does not exist 2026-03-31T20:27:27.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:27.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1635: test_mon_osd: expect_false ceph osd info osd.42 2026-03-31T20:27:27.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:27.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd info osd.42 2026-03-31T20:27:27.589 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: osd.42 does not exist 2026-03-31T20:27:27.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:27.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1637: test_mon_osd: ceph osd info 2026-03-31T20:27:27.787 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 221 
up_thru 224 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:27.787 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 8 up_thru 224 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6800/3578452477 v2:192.168.123.103:6801/3578452477 exists,up ea32c3bc-b054-4d75-ba06-731b384d7a6a 2026-03-31T20:27:27.787 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 8 up_thru 132 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6808/1523795840 v2:192.168.123.103:6809/1523795840 exists,up b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8 2026-03-31T20:27:27.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1638: test_mon_osd: ceph osd info --format=json 2026-03-31T20:27:27.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1638: test_mon_osd: jq -cM . 2026-03-31T20:27:28.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1638: test_mon_osd: info_json='[{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":8,"last_clean_end":220,"up_from":221,"up_thru":224,"down_at":219,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6813","nonce":951776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6813/951776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]},{"osd":1,"uuid":"ea32c3bc-b054-4d75-ba06-731b384d7a6a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":224,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":3578452477}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6801","nonce":3578452477}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6803","nonce":3578452477}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3578452477}]},"public_addr":"192.168.123.103:6800/3578452477","cluster_addr":"192.168.123.103:6801/3578452477","heartbeat_back_addr":"192.168.123.103:6803/3578452477","heartbeat_front_addr":"192.168.123.103:6802/3578452477","state":["exists","up"]},{"osd":2,"uuid":"b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":132,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":1523795840}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6809","nonce":1523795840}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6811","nonce":1523795840}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":1523795840}]},"public_addr":"192.168.123.103:6808/1523795840","cluster_addr":"192.168.123.103:6809/1523795840","heartbeat_back_addr":"192.168.123.103:6811/1523795840","heartb
eat_front_addr":"192.168.123.103:6810/1523795840","state":["exists","up"]}]' 2026-03-31T20:27:28.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1639: test_mon_osd: ceph osd dump --format=json 2026-03-31T20:27:28.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1639: test_mon_osd: jq -cM .osds 2026-03-31T20:27:28.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1639: test_mon_osd: dump_json='[{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":8,"last_clean_end":220,"up_from":221,"up_thru":224,"down_at":219,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6813","nonce":951776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6813/951776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]},{"osd":1,"uuid":"ea32c3bc-b054-4d75-ba06-731b384d7a6a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":224,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":3578452477}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6801","nonce":3578452477}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6803","nonce":3578452477}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3578452477}]},"public_addr":"192.168.123.103:6800/3578452477","cluster_addr":"192.168.123.103:6801/3578452477","heartbeat_back_addr":"192.168.123.103:6803/3578452477","heartbeat_front_addr":"192.168.123.103:6802/3578452477","state":["exists","up"]},{"osd":2,"uuid":"b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":132,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":1523795840}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6809","nonce":1523795840}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6811","nonce":1523795840}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":1523795840}]},"public_addr":"192.168.123.103:6808/1523795840","cluster_addr":"192.168.123.103:6809/1523795840","heartbeat_back_addr":"192.168.123.103:6811/1523795840","heartbeat_front_addr":"192.168.123.103:6810/1523795840","state":["exists","up"]}]' 2026-03-31T20:27:28.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1640: test_mon_osd: [[ 
[{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":8,"last_clean_end":220,"up_from":221,"up_thru":224,"down_at":219,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6813","nonce":951776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6813/951776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]},{"osd":1,"uuid":"ea32c3bc-b054-4d75-ba06-731b384d7a6a","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":224,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":3578452477}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6801","nonce":3578452477}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6803","nonce":3578452477}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3578452477}]},"public_addr":"192.168.123.103:6800/3578452477","cluster_addr":"192.168.123.103:6801/3578452477","heartbeat_back_addr":"192.168.123.103:6803/3578452477","heartbeat_front_addr":"192.168.123.103:6802/3578452477","state":["exists","up"]},{"osd":2,"uuid":"b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":132,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":1523795840}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6809","nonce":1523795840}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6811","nonce":1523795840}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":1523795840}]},"public_addr":"192.168.123.103:6808/1523795840","cluster_addr":"192.168.123.103:6809/1523795840","heartbeat_back_addr":"192.168.123.103:6811/1523795840","heartbeat_front_addr":"192.168.123.103:6810/1523795840","state":["exists","up"]}] != 
\[\{\"\o\s\d\"\:\0\,\"\u\u\i\d\"\:\"\f\3\b\f\f\2\2\c\-\a\2\1\c\-\4\7\0\a\-\a\e\c\1\-\c\1\5\8\b\4\9\5\2\3\b\8\"\,\"\u\p\"\:\1\,\"\i\n\"\:\1\,\"\w\e\i\g\h\t\"\:\1\,\"\p\r\i\m\a\r\y\_\a\f\f\i\n\i\t\y\"\:\1\,\"\l\a\s\t\_\c\l\e\a\n\_\b\e\g\i\n\"\:\8\,\"\l\a\s\t\_\c\l\e\a\n\_\e\n\d\"\:\2\2\0\,\"\u\p\_\f\r\o\m\"\:\2\2\1\,\"\u\p\_\t\h\r\u\"\:\2\2\4\,\"\d\o\w\n\_\a\t\"\:\2\1\9\,\"\l\o\s\t\_\a\t\"\:\0\,\"\p\u\b\l\i\c\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\4\"\,\"\n\o\n\c\e\"\:\9\5\0\7\7\6\7\8\6\}\]\}\,\"\c\l\u\s\t\e\r\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\3\"\,\"\n\o\n\c\e\"\:\9\5\1\7\7\6\7\8\6\}\]\}\,\"\h\e\a\r\t\b\e\a\t\_\b\a\c\k\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\7\"\,\"\n\o\n\c\e\"\:\9\5\0\7\7\6\7\8\6\}\]\}\,\"\h\e\a\r\t\b\e\a\t\_\f\r\o\n\t\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\6\"\,\"\n\o\n\c\e\"\:\9\5\0\7\7\6\7\8\6\}\]\}\,\"\p\u\b\l\i\c\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\4\/\9\5\0\7\7\6\7\8\6\"\,\"\c\l\u\s\t\e\r\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\3\/\9\5\1\7\7\6\7\8\6\"\,\"\h\e\a\r\t\b\e\a\t\_\b\a\c\k\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\7\/\9\5\0\7\7\6\7\8\6\"\,\"\h\e\a\r\t\b\e\a\t\_\f\r\o\n\t\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\6\/\9\5\0\7\7\6\7\8\6\"\,\"\s\t\a\t\e\"\:\[\"\e\x\i\s\t\s\"\,\"\u\p\"\]\}\,\{\"\o\s\d\"\:\1\,\"\u\u\i\d\"\:\"\e\a\3\2\c\3\b\c\-\b\0\5\4\-\4\d\7\5\-\b\a\0\6\-\7\3\1\b\3\8\4\d\7\a\6\a\"\,\"\u\p\"\:\1\,\"\i\n\"\:\1\,\"\w\e\i\g\h\t\"\:\1\,\"\p\r\i\m\a\r\y\_\a\f\f\i\n\i\t\y\"\:\1\,\"\l\a\s\t\_\c\l\e\a\n\_\b\e\g\i\n\"\:\0\,\"\l\a\s\t\_\c\l\e\a\n\_\e\n\d\"\:\0\,\"\u\p\_\f\r\o\m\"\:\8\,\"\u\p\_\t\h\r\u\"\:\2\2\4\,\"\d\o\w\n\_\a\t\"\:\0\,\"\l\o\s\t\_\a\t\"\:\0\,\"\p\u\b\l\i\c\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\0\"\,\"\n\o\n\c\e\"\:\3\5\7\8\4\5\2\4\7\7\}\]\}\,\"\c\l\u\s\t\e\r\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\1\"\,\"\n\o\n\c\e\"\:\3\5\7\8\4\5\2\4\7\7\}\]\}\,\"\h\e\a\r\t\b\e\a\t\_\b\a\c\k\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\3\"\,\"\n\o\n\c\e\"\:\3\5\7\8\4\5\2\4\7\7\}\]\}\,\"\h\e\a\r\t\b\e\a\t\_\f\r\o\n\t\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\2\"\,\"\n\o\n\c\e\"\:\3\5\7\8\4\5\2\4\7\7\}\]\}\,\"\p\u\b\l\i\c\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\0\/\3\5\7\8\4\5\2\4\7\7\"\,\"\c\l\u\s\t\e\r\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\1\/\3\5\7\8\4\5\2\4\7\7\"\,\"\h\e\a\r\t\b\e\a\t\_\b\a\c\k\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\3\/\3\5\7\8\4\5\2\4\7\7\"\,\"\h\e\a\r\t\b\e\a\t\_\f\r\o\n\t\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\2\/\3\5\7\8\4\5\2\4\7\7\"\,\"\s\t\a\t\e\"\:\[\"\e\x\i\s\t\s\"\,\"\u\p\"\]\}\,\{\"\o\s\d\"\:\2\,\"\u\u\i\d\"\:\"\b\8\c\0\4\5\0\a\-\0\0\e\1\-\4\c\a\8\-\b\0\a\c\-\3\f\0\4\6\a\8\0\f\1\b\8\"\,\"\u\p\"\:\1\,\"\i\n\"\:\1\,\"\w\e\i\g\h\t\"\:\1\,\"\p\r\i\m\a\r\y\_\a\f\f\i\n\i\t\y\"\:\1\,\"\l\a\s\t\_\c\l\e\a\n\_\b\e\g\i\n\"\:\0\,\"\l\a\s\t\_\c\l\e\a\n\_\e\n\d\"\:\0\,\"\u\p\_\f\r\o\m\"\:\8\,\"\
u\p\_\t\h\r\u\"\:\1\3\2\,\"\d\o\w\n\_\a\t\"\:\0\,\"\l\o\s\t\_\a\t\"\:\0\,\"\p\u\b\l\i\c\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\8\"\,\"\n\o\n\c\e\"\:\1\5\2\3\7\9\5\8\4\0\}\]\}\,\"\c\l\u\s\t\e\r\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\9\"\,\"\n\o\n\c\e\"\:\1\5\2\3\7\9\5\8\4\0\}\]\}\,\"\h\e\a\r\t\b\e\a\t\_\b\a\c\k\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\1\"\,\"\n\o\n\c\e\"\:\1\5\2\3\7\9\5\8\4\0\}\]\}\,\"\h\e\a\r\t\b\e\a\t\_\f\r\o\n\t\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\0\"\,\"\n\o\n\c\e\"\:\1\5\2\3\7\9\5\8\4\0\}\]\}\,\"\p\u\b\l\i\c\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\8\/\1\5\2\3\7\9\5\8\4\0\"\,\"\c\l\u\s\t\e\r\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\9\/\1\5\2\3\7\9\5\8\4\0\"\,\"\h\e\a\r\t\b\e\a\t\_\b\a\c\k\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\1\/\1\5\2\3\7\9\5\8\4\0\"\,\"\h\e\a\r\t\b\e\a\t\_\f\r\o\n\t\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\0\/\1\5\2\3\7\9\5\8\4\0\"\,\"\s\t\a\t\e\"\:\[\"\e\x\i\s\t\s\"\,\"\u\p\"\]\}\] ]] 2026-03-31T20:27:28.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1648: test_mon_osd: ceph osd info 0 --format=json 2026-03-31T20:27:28.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1648: test_mon_osd: jq -cM . 2026-03-31T20:27:28.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1648: test_mon_osd: info_json='{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":8,"last_clean_end":220,"up_from":221,"up_thru":224,"down_at":219,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6813","nonce":951776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6813/951776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]}' 2026-03-31T20:27:28.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1650: test_mon_osd: ceph osd dump --format=json 2026-03-31T20:27:28.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1650: test_mon_osd: jq -cM '.osds[] | select(.osd == 0)' 2026-03-31T20:27:28.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1650: test_mon_osd: 
dump_json='{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":8,"last_clean_end":220,"up_from":221,"up_thru":224,"down_at":219,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6813","nonce":951776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6813/951776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]}' 2026-03-31T20:27:28.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1651: test_mon_osd: [[ {"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":8,"last_clean_end":220,"up_from":221,"up_thru":224,"down_at":219,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6813","nonce":951776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6813/951776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]} == \{\"\o\s\d\"\:\0\,\"\u\u\i\d\"\:\"\f\3\b\f\f\2\2\c\-\a\2\1\c\-\4\7\0\a\-\a\e\c\1\-\c\1\5\8\b\4\9\5\2\3\b\8\"\,\"\u\p\"\:\1\,\"\i\n\"\:\1\,\"\w\e\i\g\h\t\"\:\1\,\"\p\r\i\m\a\r\y\_\a\f\f\i\n\i\t\y\"\:\1\,\"\l\a\s\t\_\c\l\e\a\n\_\b\e\g\i\n\"\:\8\,\"\l\a\s\t\_\c\l\e\a\n\_\e\n\d\"\:\2\2\0\,\"\u\p\_\f\r\o\m\"\:\2\2\1\,\"\u\p\_\t\h\r\u\"\:\2\2\4\,\"\d\o\w\n\_\a\t\"\:\2\1\9\,\"\l\o\s\t\_\a\t\"\:\0\,\"\p\u\b\l\i\c\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\4\"\,\"\n\o\n\c\e\"\:\9\5\0\7\7\6\7\8\6\}\]\}\,\"\c\l\u\s\t\e\r\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\3\"\,\"\n\o\n\c\e\"\:\9\5\1\7\7\6\7\8\6\}\]\}\,\"\h\e\a\r\t\b\e\a\t\_\b\a\c\k\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\7\"\,\"\n\o\n\c\e\"\:\9\5\0\7\7\6\7\8\6\}\]\}\,\"\h\e\a\r\t\b\e\a\t\_\f\r\o\n\t\_\a\d\d\r\s\"\:\{\"\a\d\d\r\v\e\c\"\:\[\{\"\t\y\p\e\"\:\"\v\2\"\,\"\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\6\"\,\"\n\o\n\c\e\"\:\9\5\0\7\7\6\7\8\6\}\]\}\,\"\p\u\b\l\i\c\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\4\/\9\5\0\7\7\6\7\8\6\"\,\"\c\l\u\s\t\e\r\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\3\/\9\5\1\7\7\6\7\8\6\"\,\"\h\e\a\r\t\b\e\a\t\_\b\a\c\k\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\7\/\9\5\0\7\7\6\7\8\6\"\,\"\h\e\a\r\t\b\e\a\t\_\f\r\o\n\t\_\a\d\d\r\"\:\"\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\6\/\9\5\0\7\7\6\7\8\6\"\,\"\s\t\a\t\e\"\:\[\"\e\x\i\s\t\s\"\,\"\u\p\"\]\} ]] 2026-03-31T20:27:28.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1653: 
test_mon_osd: ceph osd info 2026-03-31T20:27:28.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1653: test_mon_osd: info_plain='osd.0 up in weight 1 up_from 221 up_thru 224 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:28.827 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 8 up_thru 224 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6800/3578452477 v2:192.168.123.103:6801/3578452477 exists,up ea32c3bc-b054-4d75-ba06-731b384d7a6a 2026-03-31T20:27:28.827 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 8 up_thru 132 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6808/1523795840 v2:192.168.123.103:6809/1523795840 exists,up b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8' 2026-03-31T20:27:28.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1654: test_mon_osd: ceph osd dump 2026-03-31T20:27:28.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1654: test_mon_osd: grep '^osd' 2026-03-31T20:27:29.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1654: test_mon_osd: dump_plain='osd.0 up in weight 1 up_from 221 up_thru 224 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:29.032 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 8 up_thru 224 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6800/3578452477 v2:192.168.123.103:6801/3578452477 exists,up ea32c3bc-b054-4d75-ba06-731b384d7a6a 2026-03-31T20:27:29.032 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 8 up_thru 132 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6808/1523795840 v2:192.168.123.103:6809/1523795840 exists,up b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8' 2026-03-31T20:27:29.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1655: test_mon_osd: [[ osd.0 up in weight 1 up_from 221 up_thru 224 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:27:29.032 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 8 up_thru 224 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6800/3578452477 v2:192.168.123.103:6801/3578452477 exists,up ea32c3bc-b054-4d75-ba06-731b384d7a6a 2026-03-31T20:27:29.032 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 8 up_thru 132 down_at 0 last_clean_interval [0,0) v2:192.168.123.103:6808/1523795840 v2:192.168.123.103:6809/1523795840 exists,up b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8 == \o\s\d\.\0\ \u\p\ \ \ \i\n\ \ \w\e\i\g\h\t\ \1\ \u\p\_\f\r\o\m\ \2\2\1\ \u\p\_\t\h\r\u\ \2\2\4\ \d\o\w\n\_\a\t\ \2\1\9\ \l\a\s\t\_\c\l\e\a\n\_\i\n\t\e\r\v\a\l\ \[\8\,\2\2\0\)\ \v\2\:\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\4\/\9\5\0\7\7\6\7\8\6\ \v\2\:\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\3\/\9\5\1\7\7\6\7\8\6\ \e\x\i\s\t\s\,\u\p\ \f\3\b\f\f\2\2\c\-\a\2\1\c\-\4\7\0\a\-\a\e\c\1\-\c\1\5\8\b\4\9\5\2\3\b\8\ 2026-03-31T20:27:29.032 INFO:tasks.workunit.client.0.vm03.stderr:\o\s\d\.\1\ \u\p\ \ \ \i\n\ \ \w\e\i\g\h\t\ \1\ \u\p\_\f\r\o\m\ \8\ 
\u\p\_\t\h\r\u\ \2\2\4\ \d\o\w\n\_\a\t\ \0\ \l\a\s\t\_\c\l\e\a\n\_\i\n\t\e\r\v\a\l\ \[\0\,\0\)\ \v\2\:\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\0\/\3\5\7\8\4\5\2\4\7\7\ \v\2\:\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\1\/\3\5\7\8\4\5\2\4\7\7\ \e\x\i\s\t\s\,\u\p\ \e\a\3\2\c\3\b\c\-\b\0\5\4\-\4\d\7\5\-\b\a\0\6\-\7\3\1\b\3\8\4\d\7\a\6\a\ 2026-03-31T20:27:29.032 INFO:tasks.workunit.client.0.vm03.stderr:\o\s\d\.\2\ \u\p\ \ \ \i\n\ \ \w\e\i\g\h\t\ \1\ \u\p\_\f\r\o\m\ \8\ \u\p\_\t\h\r\u\ \1\3\2\ \d\o\w\n\_\a\t\ \0\ \l\a\s\t\_\c\l\e\a\n\_\i\n\t\e\r\v\a\l\ \[\0\,\0\)\ \v\2\:\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\8\/\1\5\2\3\7\9\5\8\4\0\ \v\2\:\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\9\/\1\5\2\3\7\9\5\8\4\0\ \e\x\i\s\t\s\,\u\p\ \b\8\c\0\4\5\0\a\-\0\0\e\1\-\4\c\a\8\-\b\0\a\c\-\3\f\0\4\6\a\8\0\f\1\b\8 ]] 2026-03-31T20:27:29.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1657: test_mon_osd: ceph osd info 0 2026-03-31T20:27:29.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1657: test_mon_osd: info_plain='osd.0 up in weight 1 up_from 221 up_thru 224 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8' 2026-03-31T20:27:29.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1658: test_mon_osd: ceph osd dump 2026-03-31T20:27:29.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1658: test_mon_osd: grep '^osd.0' 2026-03-31T20:27:29.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1658: test_mon_osd: dump_plain='osd.0 up in weight 1 up_from 221 up_thru 224 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8' 2026-03-31T20:27:29.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1659: test_mon_osd: [[ osd.0 up in weight 1 up_from 221 up_thru 224 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 == \o\s\d\.\0\ \u\p\ \ \ \i\n\ \ \w\e\i\g\h\t\ \1\ \u\p\_\f\r\o\m\ \2\2\1\ \u\p\_\t\h\r\u\ \2\2\4\ \d\o\w\n\_\a\t\ \2\1\9\ \l\a\s\t\_\c\l\e\a\n\_\i\n\t\e\r\v\a\l\ \[\8\,\2\2\0\)\ \v\2\:\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\0\4\/\9\5\0\7\7\6\7\8\6\ \v\2\:\1\9\2\.\1\6\8\.\1\2\3\.\1\0\3\:\6\8\1\3\/\9\5\1\7\7\6\7\8\6\ \e\x\i\s\t\s\,\u\p\ \f\3\b\f\f\2\2\c\-\a\2\1\c\-\4\7\0\a\-\a\e\c\1\-\c\1\5\8\b\4\9\5\2\3\b\8 ]] 2026-03-31T20:27:29.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1661: test_mon_osd: ceph osd add-nodown 0 1 2026-03-31T20:27:30.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1662: test_mon_osd: ceph health detail 2026-03-31T20:27:30.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1662: test_mon_osd: grep NODOWN 2026-03-31T20:27:30.469 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_WARN 2 OSDs or CRUSH {nodes, device-classes} have {NOUP,NODOWN,NOIN,NOOUT} flags set 2026-03-31T20:27:30.469 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] OSD_FLAGS: 2 OSDs or 
CRUSH {nodes, device-classes} have {NOUP,NODOWN,NOIN,NOOUT} flags set 2026-03-31T20:27:30.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1663: test_mon_osd: ceph osd rm-nodown 0 1 2026-03-31T20:27:31.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1664: test_mon_osd: ceph health detail 2026-03-31T20:27:31.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1664: test_mon_osd: grep NODOWN 2026-03-31T20:27:31.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1666: test_mon_osd: ceph osd out 0 2026-03-31T20:27:31.625 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:31.622+0000 7f8708384640 -1 mon.a@0(leader).osd e227 definitely_dead 0 2026-03-31T20:27:32.205 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:32.198+0000 7f8708384640 -1 mon.a@0(leader).osd e228 definitely_dead 0 2026-03-31T20:27:32.205 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 is already out. 2026-03-31T20:27:32.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1667: test_mon_osd: ceph osd add-noin 0 2026-03-31T20:27:33.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1668: test_mon_osd: ceph health detail 2026-03-31T20:27:33.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1668: test_mon_osd: grep NOIN 2026-03-31T20:27:33.503 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_WARN 1 OSDs or CRUSH {nodes, device-classes} have {NOUP,NODOWN,NOIN,NOOUT} flags set 2026-03-31T20:27:33.503 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] OSD_FLAGS: 1 OSDs or CRUSH {nodes, device-classes} have {NOUP,NODOWN,NOIN,NOOUT} flags set 2026-03-31T20:27:33.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1669: test_mon_osd: ceph osd rm-noin 0 2026-03-31T20:27:34.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1670: test_mon_osd: ceph health detail 2026-03-31T20:27:34.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1670: test_mon_osd: grep NOIN 2026-03-31T20:27:34.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1671: test_mon_osd: ceph osd in 0 2026-03-31T20:27:34.653 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:34.646+0000 7f8708384640 -1 mon.a@0(leader).osd e230 definitely_dead 0 2026-03-31T20:27:35.234 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:27:35.230+0000 7f8708384640 -1 mon.a@0(leader).osd e231 definitely_dead 0 2026-03-31T20:27:35.234 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 is already in. 
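The sequence above is a set/verify/clear round-trip on per-OSD flags: set a flag with 'ceph osd add-<flag> <id>', confirm that 'ceph health detail' reports the OSD_FLAGS warning, clear it with 'ceph osd rm-<flag> <id>', and confirm the warning disappears. A condensed standalone sketch of the same round-trip, assuming a disposable test cluster with an existing osd.0 (not part of the workunit itself):

    #!/bin/bash -x
    # Round-trip a per-OSD flag and check the health warning both ways.
    ceph osd add-noout 0
    ceph health detail | grep NOOUT || exit 1     # OSD_FLAGS warning must be present
    ceph osd rm-noout 0
    ! ceph health detail | grep NOOUT || exit 1   # and must be gone after removal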
2026-03-31T20:27:35.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1673: test_mon_osd: ceph osd add-noout 0
2026-03-31T20:27:36.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1674: test_mon_osd: ceph health detail
2026-03-31T20:27:36.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1674: test_mon_osd: grep NOOUT
2026-03-31T20:27:36.520 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_WARN 1 OSDs or CRUSH {nodes, device-classes} have {NOUP,NODOWN,NOIN,NOOUT} flags set
2026-03-31T20:27:36.520 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] OSD_FLAGS: 1 OSDs or CRUSH {nodes, device-classes} have {NOUP,NODOWN,NOIN,NOOUT} flags set
2026-03-31T20:27:36.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1675: test_mon_osd: ceph osd rm-noout 0
2026-03-31T20:27:37.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1676: test_mon_osd: ceph health detail
2026-03-31T20:27:37.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1676: test_mon_osd: grep NOOUT
2026-03-31T20:27:37.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1679: test_mon_osd: expect_false ceph osd add-noup 797er
2026-03-31T20:27:37.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:27:37.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd add-noup 797er
2026-03-31T20:27:37.672 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unable to parse osd id or crush node or device class: "797er".
2026-03-31T20:27:37.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:27:37.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1680: test_mon_osd: expect_false ceph osd add-nodown u9uwer
2026-03-31T20:27:37.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:27:37.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd add-nodown u9uwer
2026-03-31T20:27:37.825 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unable to parse osd id or crush node or device class: "u9uwer".
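The EINVAL responses here come from the monitor's target parser: the flag commands accept a numeric OSD id, an osd.N name, a CRUSH node, or a device class, and reject anything else with "unable to parse osd id or crush node or device class". A client-side pre-check could fail fast before the monitor round-trip; a hypothetical guard (the helper name osd_flag_target_ok is illustrative, not from the workunit):

    # Accept a bare integer id, an osd.N name, or a name present in the
    # CRUSH tree; anything else would be rejected with EINVAL anyway.
    # (Device classes would need a separate 'ceph osd crush class ls' check.)
    osd_flag_target_ok() {
        local t=$1
        [[ $t =~ ^[0-9]+$ || $t =~ ^osd\.[0-9]+$ ]] && return 0
        ceph osd crush ls "$t" >/dev/null 2>&1
    }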
2026-03-31T20:27:37.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:27:37.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1681: test_mon_osd: expect_false ceph osd add-noin 78~15
2026-03-31T20:27:37.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:27:37.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd add-noin 78~15
2026-03-31T20:27:37.979 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unable to parse osd id or crush node or device class: "78~15".
2026-03-31T20:27:37.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:27:37.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1683: test_mon_osd: expect_false ceph osd rm-noup 1234567
2026-03-31T20:27:37.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:27:37.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd rm-noup 1234567
2026-03-31T20:27:38.130 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unable to parse osd id or crush node or device class: "1234567".
2026-03-31T20:27:38.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:27:38.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1684: test_mon_osd: expect_false ceph osd rm-nodown fsadf7
2026-03-31T20:27:38.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:27:38.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd rm-nodown fsadf7
2026-03-31T20:27:38.287 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unable to parse osd id or crush node or device class: "fsadf7".
2026-03-31T20:27:38.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:27:38.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1685: test_mon_osd: expect_false ceph osd rm-noout 790-fd
2026-03-31T20:27:38.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:27:38.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd rm-noout 790-fd
2026-03-31T20:27:38.439 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unable to parse osd id or crush node or device class: "790-fd".
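The expect_false helper that wraps each of these negative cases (test.sh lines 35-36 in the xtrace) inverts the exit status of its argument, so commands that are required to fail can run under 'set -e' without aborting the workunit. Reconstructed from the trace, it is approximately:

    expect_false() {
        set -x
        # Succeed only if the wrapped command fails.
        if "$@"; then return 1; else return 0; fi
    }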
2026-03-31T20:27:38.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:38.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1687: test_mon_osd: ceph osd ls-tree default 2026-03-31T20:27:38.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1687: test_mon_osd: ids='0 2026-03-31T20:27:38.646 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:27:38.646 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:27:38.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1688: test_mon_osd: for osd in $ids 2026-03-31T20:27:38.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1690: test_mon_osd: ceph osd add-nodown 0 2026-03-31T20:27:39.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1691: test_mon_osd: ceph osd add-noout 0 2026-03-31T20:27:40.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1688: test_mon_osd: for osd in $ids 2026-03-31T20:27:40.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1690: test_mon_osd: ceph osd add-nodown 1 2026-03-31T20:27:41.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1691: test_mon_osd: ceph osd add-noout 1 2026-03-31T20:27:42.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1688: test_mon_osd: for osd in $ids 2026-03-31T20:27:42.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1690: test_mon_osd: ceph osd add-nodown 2 2026-03-31T20:27:43.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1691: test_mon_osd: ceph osd add-noout 2 2026-03-31T20:27:44.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1693: test_mon_osd: ceph -s 2026-03-31T20:27:44.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1693: test_mon_osd: grep NODOWN 2026-03-31T20:27:44.580 INFO:tasks.workunit.client.0.vm03.stdout: 3 OSDs or CRUSH {nodes, device-classes} have {NOUP,NODOWN,NOIN,NOOUT} flags set 2026-03-31T20:27:44.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1694: test_mon_osd: ceph -s 2026-03-31T20:27:44.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1694: test_mon_osd: grep NOOUT 2026-03-31T20:27:44.852 INFO:tasks.workunit.client.0.vm03.stdout: 3 OSDs or CRUSH {nodes, device-classes} have {NOUP,NODOWN,NOIN,NOOUT} flags set 2026-03-31T20:27:44.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1695: test_mon_osd: ceph osd rm-nodown any 2026-03-31T20:27:45.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1696: test_mon_osd: ceph osd rm-noout all 2026-03-31T20:27:46.334 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1697: test_mon_osd: ceph -s 2026-03-31T20:27:46.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1697: test_mon_osd: grep NODOWN 2026-03-31T20:27:46.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1698: test_mon_osd: ceph -s 2026-03-31T20:27:46.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1698: test_mon_osd: grep NOOUT 2026-03-31T20:27:46.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1701: test_mon_osd: ceph osd add-noup osd.0 2026-03-31T20:27:47.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1702: test_mon_osd: ceph osd add-nodown osd.0 2026-03-31T20:27:48.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1703: test_mon_osd: ceph osd add-noin osd.0 2026-03-31T20:27:49.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1704: test_mon_osd: ceph osd add-noout osd.0 2026-03-31T20:27:50.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1705: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:27:50.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1705: test_mon_osd: jq .crush_node_flags 2026-03-31T20:27:50.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1705: test_mon_osd: expect_false grep osd.0 2026-03-31T20:27:50.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:50.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep osd.0 2026-03-31T20:27:50.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:50.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1706: test_mon_osd: ceph osd rm-noup osd.0 2026-03-31T20:27:51.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1707: test_mon_osd: ceph osd rm-nodown osd.0 2026-03-31T20:27:52.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1708: test_mon_osd: ceph osd rm-noin osd.0 2026-03-31T20:27:53.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1709: test_mon_osd: ceph osd rm-noout osd.0 2026-03-31T20:27:54.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1710: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:27:54.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1710: test_mon_osd: jq .crush_node_flags 2026-03-31T20:27:54.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1710: test_mon_osd: 
expect_false grep osd.0 2026-03-31T20:27:54.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:27:54.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep osd.0 2026-03-31T20:27:54.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:27:54.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1712: test_mon_osd: ceph osd crush add-bucket foo host root=default 2026-03-31T20:27:55.383 INFO:tasks.workunit.client.0.vm03.stderr:bucket 'foo' already exists 2026-03-31T20:27:55.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1713: test_mon_osd: ceph osd add-noup foo 2026-03-31T20:27:57.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1714: test_mon_osd: ceph osd add-nodown foo 2026-03-31T20:27:59.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1715: test_mon_osd: ceph osd add-noin foo 2026-03-31T20:28:01.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1716: test_mon_osd: ceph osd add-noout foo 2026-03-31T20:28:03.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1717: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:03.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1717: test_mon_osd: jq .crush_node_flags 2026-03-31T20:28:03.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1717: test_mon_osd: grep foo 2026-03-31T20:28:03.614 INFO:tasks.workunit.client.0.vm03.stdout: "foo": [ 2026-03-31T20:28:03.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1718: test_mon_osd: ceph osd rm-noup foo 2026-03-31T20:28:05.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1719: test_mon_osd: ceph osd rm-nodown foo 2026-03-31T20:28:07.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1720: test_mon_osd: ceph osd rm-noin foo 2026-03-31T20:28:09.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1721: test_mon_osd: ceph osd rm-noout foo 2026-03-31T20:28:11.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1722: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:11.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1722: test_mon_osd: jq .crush_node_flags 2026-03-31T20:28:11.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1722: test_mon_osd: expect_false grep foo 2026-03-31T20:28:11.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:11.478 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep foo 2026-03-31T20:28:11.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:11.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1723: test_mon_osd: ceph osd add-noup foo 2026-03-31T20:28:13.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1724: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:13.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1724: test_mon_osd: jq .crush_node_flags 2026-03-31T20:28:13.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1724: test_mon_osd: grep foo 2026-03-31T20:28:13.697 INFO:tasks.workunit.client.0.vm03.stdout: "foo": [ 2026-03-31T20:28:13.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1725: test_mon_osd: ceph osd crush rm foo 2026-03-31T20:28:14.538 INFO:tasks.workunit.client.0.vm03.stderr:device 'foo' does not appear in the crush map 2026-03-31T20:28:14.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1726: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:14.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1726: test_mon_osd: jq .crush_node_flags 2026-03-31T20:28:14.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1726: test_mon_osd: expect_false grep foo 2026-03-31T20:28:14.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:14.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep foo 2026-03-31T20:28:14.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:14.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1728: test_mon_osd: ceph osd set-group noup osd.0 2026-03-31T20:28:15.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1729: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:15.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1729: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:15.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1729: test_mon_osd: grep noup 2026-03-31T20:28:15.760 INFO:tasks.workunit.client.0.vm03.stdout: "noup", 2026-03-31T20:28:15.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1730: test_mon_osd: ceph osd set-group noup,nodown osd.0 2026-03-31T20:28:16.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1731: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:16.560 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1731: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:16.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1731: test_mon_osd: grep noup 2026-03-31T20:28:16.768 INFO:tasks.workunit.client.0.vm03.stdout: "noup", 2026-03-31T20:28:16.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1732: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:16.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1732: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:16.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1732: test_mon_osd: grep nodown 2026-03-31T20:28:16.977 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:16.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1733: test_mon_osd: ceph osd set-group noup,nodown,noin osd.0 2026-03-31T20:28:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1734: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:17.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1734: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:17.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1734: test_mon_osd: grep noup 2026-03-31T20:28:17.772 INFO:tasks.workunit.client.0.vm03.stdout: "noup", 2026-03-31T20:28:17.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1735: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:17.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1735: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:17.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1735: test_mon_osd: grep nodown 2026-03-31T20:28:17.976 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:17.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1736: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:17.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1736: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:17.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1736: test_mon_osd: grep noin 2026-03-31T20:28:18.181 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:18.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1737: test_mon_osd: ceph osd set-group noup,nodown,noin,noout osd.0 2026-03-31T20:28:18.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1738: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:18.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1738: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:18.575 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1738: test_mon_osd: grep noup 2026-03-31T20:28:18.777 INFO:tasks.workunit.client.0.vm03.stdout: "noup", 2026-03-31T20:28:18.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1739: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:18.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1739: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:18.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1739: test_mon_osd: grep nodown 2026-03-31T20:28:18.981 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:18.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1740: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:18.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1740: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:18.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1740: test_mon_osd: grep noin 2026-03-31T20:28:19.202 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:19.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1741: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:19.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1741: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:19.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1741: test_mon_osd: grep noout 2026-03-31T20:28:19.405 INFO:tasks.workunit.client.0.vm03.stdout: "noout", 2026-03-31T20:28:19.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1742: test_mon_osd: ceph osd unset-group noup osd.0 2026-03-31T20:28:20.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1743: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:20.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1743: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:20.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1743: test_mon_osd: expect_false grep noup 2026-03-31T20:28:20.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:20.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep noup 2026-03-31T20:28:20.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:20.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1744: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:20.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1744: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:20.787 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1744: test_mon_osd: grep nodown 2026-03-31T20:28:20.989 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:20.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1745: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:20.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1745: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:20.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1745: test_mon_osd: grep noin 2026-03-31T20:28:21.193 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:21.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1746: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:21.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1746: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:21.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1746: test_mon_osd: grep noout 2026-03-31T20:28:21.397 INFO:tasks.workunit.client.0.vm03.stdout: "noout", 2026-03-31T20:28:21.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1747: test_mon_osd: ceph osd unset-group noup,nodown osd.0 2026-03-31T20:28:22.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1748: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:22.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1748: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:22.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1748: test_mon_osd: expect_false grep 'noup\|nodown' 2026-03-31T20:28:22.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:22.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown' 2026-03-31T20:28:22.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:22.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1749: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:22.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1749: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:22.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1749: test_mon_osd: grep noin 2026-03-31T20:28:23.003 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:23.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1750: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:23.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1750: test_mon_osd: jq '.osds[0].state' 
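The block above exercises per-OSD flag groups: test.sh sets several flags on osd.0 in one set-group call, confirms each one appears in that OSD's "state" array in the osdmap dump, then removes them with unset-group and asserts (via expect_false) that none remain. A minimal sketch of the pattern being tested, assuming a reachable cluster with an osd.0 and jq on the PATH; the flag list is illustrative:

    # set three flags at once, then verify each one individually
    ceph osd set-group noup,nodown,noin osd.0
    for f in noup nodown noin; do
        ceph osd dump -f json-pretty | jq '.osds[0].state' | grep "$f"
    done
    # unset the whole group; no flag from the group should survive
    ceph osd unset-group noup,nodown,noin osd.0
    ceph osd dump -f json-pretty | jq '.osds[0].state' \
        | grep 'noup\|nodown\|noin' && echo "flag unexpectedly still set"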
2026-03-31T20:28:23.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1750: test_mon_osd: grep noout 2026-03-31T20:28:23.206 INFO:tasks.workunit.client.0.vm03.stdout: "noout", 2026-03-31T20:28:23.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1751: test_mon_osd: ceph osd unset-group noup,nodown,noin osd.0 2026-03-31T20:28:23.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1752: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:23.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1752: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:23.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1752: test_mon_osd: expect_false grep 'noup\|nodown\|noin' 2026-03-31T20:28:23.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:23.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown\|noin' 2026-03-31T20:28:23.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:23.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1753: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:23.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1753: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:23.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1753: test_mon_osd: grep noout 2026-03-31T20:28:24.016 INFO:tasks.workunit.client.0.vm03.stdout: "noout", 2026-03-31T20:28:24.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1754: test_mon_osd: ceph osd unset-group noup,nodown,noin,noout osd.0 2026-03-31T20:28:24.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1755: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:24.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1755: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:24.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1755: test_mon_osd: expect_false grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:28:24.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:24.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:28:24.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:24.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1757: test_mon_osd: ceph osd set-group noup,nodown,noin,noout osd.0 osd.1 2026-03-31T20:28:25.620 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1758: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:25.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1758: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:25.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1758: test_mon_osd: grep noup 2026-03-31T20:28:25.823 INFO:tasks.workunit.client.0.vm03.stdout: "noup", 2026-03-31T20:28:25.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1759: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:25.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1759: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:25.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1759: test_mon_osd: grep nodown 2026-03-31T20:28:26.027 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:26.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1760: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:26.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1760: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:26.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1760: test_mon_osd: grep noin 2026-03-31T20:28:26.229 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:26.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1761: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:26.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1761: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:26.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1761: test_mon_osd: grep noout 2026-03-31T20:28:26.431 INFO:tasks.workunit.client.0.vm03.stdout: "noout", 2026-03-31T20:28:26.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1762: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:26.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1762: test_mon_osd: jq '.osds[1].state' 2026-03-31T20:28:26.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1762: test_mon_osd: grep noup 2026-03-31T20:28:26.636 INFO:tasks.workunit.client.0.vm03.stdout: "noup", 2026-03-31T20:28:26.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1763: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:26.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1763: test_mon_osd: jq '.osds[1].state' 2026-03-31T20:28:26.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1763: test_mon_osd: grep nodown 2026-03-31T20:28:26.838 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:26.838 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1764: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:26.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1764: test_mon_osd: jq '.osds[1].state' 2026-03-31T20:28:26.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1764: test_mon_osd: grep noin 2026-03-31T20:28:27.043 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:27.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1765: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:27.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1765: test_mon_osd: jq '.osds[1].state' 2026-03-31T20:28:27.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1765: test_mon_osd: grep noout 2026-03-31T20:28:27.247 INFO:tasks.workunit.client.0.vm03.stdout: "noout", 2026-03-31T20:28:27.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1766: test_mon_osd: ceph osd unset-group noup,nodown,noin,noout osd.0 osd.1 2026-03-31T20:28:27.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1767: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:27.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1767: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:27.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1767: test_mon_osd: expect_false grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:28:27.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:27.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:28:27.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:27.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1768: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:27.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1768: test_mon_osd: jq '.osds[1].state' 2026-03-31T20:28:27.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1768: test_mon_osd: expect_false grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:28:27.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:27.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:28:28.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:28.037 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1770: test_mon_osd: ceph osd set-group noup all 2026-03-31T20:28:28.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1771: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:28.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1771: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:28.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1771: test_mon_osd: grep noup 2026-03-31T20:28:28.846 INFO:tasks.workunit.client.0.vm03.stdout: "noup", 2026-03-31T20:28:28.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1772: test_mon_osd: ceph osd unset-group noup all 2026-03-31T20:28:29.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1773: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:29.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1773: test_mon_osd: jq '.osds[0].state' 2026-03-31T20:28:29.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1773: test_mon_osd: expect_false grep noup 2026-03-31T20:28:29.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:29.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep noup 2026-03-31T20:28:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1776: test_mon_osd: ceph osd crush add-bucket foo host root=default 2026-03-31T20:28:30.630 INFO:tasks.workunit.client.0.vm03.stderr:bucket 'foo' already exists 2026-03-31T20:28:30.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1777: test_mon_osd: ceph osd set-group noup foo 2026-03-31T20:28:32.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1778: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:32.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1778: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:32.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1778: test_mon_osd: grep noup 2026-03-31T20:28:32.820 INFO:tasks.workunit.client.0.vm03.stdout: "noup" 2026-03-31T20:28:32.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1779: test_mon_osd: ceph osd set-group noup,nodown foo 2026-03-31T20:28:34.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1780: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:34.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1780: test_mon_osd: jq .crush_node_flags.foo 
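Besides individual OSD ids, set-group/unset-group accept the literal `all` (test.sh:1770-1773 above) and CRUSH bucket names: flags applied to the `foo` host bucket are recorded under `crush_node_flags` in the osdmap rather than in any per-OSD `state`. A condensed sketch of the same steps, reusing the trace's throwaway bucket name:

    ceph osd set-group noup all                        # every OSD at once
    ceph osd unset-group noup all
    ceph osd crush add-bucket foo host root=default    # scratch host bucket
    ceph osd set-group noup,nodown foo
    ceph osd dump -f json-pretty | jq .crush_node_flags.foo | grep noup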
2026-03-31T20:28:34.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1780: test_mon_osd: grep noup 2026-03-31T20:28:34.827 INFO:tasks.workunit.client.0.vm03.stdout: "noup" 2026-03-31T20:28:34.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1781: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:34.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1781: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:34.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1781: test_mon_osd: grep nodown 2026-03-31T20:28:35.032 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:35.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1782: test_mon_osd: ceph osd set-group noup,nodown,noin foo 2026-03-31T20:28:36.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1783: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:36.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1783: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:36.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1783: test_mon_osd: grep noup 2026-03-31T20:28:36.847 INFO:tasks.workunit.client.0.vm03.stdout: "noup" 2026-03-31T20:28:36.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1784: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:36.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1784: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:36.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1784: test_mon_osd: grep nodown 2026-03-31T20:28:37.052 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1785: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1785: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1785: test_mon_osd: grep noin 2026-03-31T20:28:37.254 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:37.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1786: test_mon_osd: ceph osd set-group noup,nodown,noin,noout foo 2026-03-31T20:28:38.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1787: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:38.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1787: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:38.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1787: test_mon_osd: grep noup 2026-03-31T20:28:38.860 
INFO:tasks.workunit.client.0.vm03.stdout: "noup" 2026-03-31T20:28:38.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1788: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:38.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1788: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:38.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1788: test_mon_osd: grep nodown 2026-03-31T20:28:39.067 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:39.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1789: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:39.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1789: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:39.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1789: test_mon_osd: grep noin 2026-03-31T20:28:39.271 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:39.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1790: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:39.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1790: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:39.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1790: test_mon_osd: grep noout 2026-03-31T20:28:39.477 INFO:tasks.workunit.client.0.vm03.stdout: "noout", 2026-03-31T20:28:39.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1792: test_mon_osd: ceph osd unset-group noup foo 2026-03-31T20:28:40.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1793: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:40.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1793: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:40.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1793: test_mon_osd: expect_false grep noup 2026-03-31T20:28:40.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:40.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep noup 2026-03-31T20:28:40.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:40.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1794: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:40.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1794: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:40.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1794: test_mon_osd: grep nodown 
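Every negative assertion in this trace runs through the suite's expect_false wrapper, whose two traced lines (test.sh:35-36) enable xtrace and invert the wrapped command's exit status. A paraphrase consistent with the traced behaviour, not a verbatim copy of the helper:

    expect_false() {
        set -x                                      # test.sh:35 in the trace
        if "$@"; then return 1; else return 0; fi   # test.sh:36
    }
    # e.g. assert that noup really was removed from the bucket:
    ceph osd dump -f json-pretty | jq .crush_node_flags.foo | expect_false grep noup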
2026-03-31T20:28:41.081 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:41.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1795: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:41.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1795: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:41.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1795: test_mon_osd: grep noin 2026-03-31T20:28:41.283 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:41.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1796: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:41.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1796: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:41.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1796: test_mon_osd: grep noout 2026-03-31T20:28:41.485 INFO:tasks.workunit.client.0.vm03.stdout: "noout" 2026-03-31T20:28:41.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1797: test_mon_osd: ceph osd unset-group noup,nodown foo 2026-03-31T20:28:42.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1798: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:42.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1798: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:42.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1798: test_mon_osd: expect_false grep 'noup\|nodown' 2026-03-31T20:28:42.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:42.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown' 2026-03-31T20:28:42.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1799: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1799: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1799: test_mon_osd: grep noin 2026-03-31T20:28:43.096 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:43.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1800: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:43.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1800: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:43.097 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1800: test_mon_osd: grep noout 2026-03-31T20:28:43.302 INFO:tasks.workunit.client.0.vm03.stdout: "noout" 2026-03-31T20:28:43.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1801: test_mon_osd: ceph osd unset-group noup,nodown,noin foo 2026-03-31T20:28:44.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1802: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:44.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1802: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:44.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1802: test_mon_osd: expect_false grep 'noup\|nodown\|noin' 2026-03-31T20:28:44.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:44.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown\|noin' 2026-03-31T20:28:44.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:44.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1803: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:44.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1803: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:44.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1803: test_mon_osd: grep noout 2026-03-31T20:28:45.108 INFO:tasks.workunit.client.0.vm03.stdout: "noout" 2026-03-31T20:28:45.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1804: test_mon_osd: ceph osd unset-group noup,nodown,noin,noout foo 2026-03-31T20:28:46.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1805: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:46.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1805: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:46.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1805: test_mon_osd: expect_false grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:28:46.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:46.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:28:46.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:46.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1807: test_mon_osd: ceph osd set-group noin,noout foo 2026-03-31T20:28:48.728 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1808: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:48.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1808: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:48.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1808: test_mon_osd: grep noin 2026-03-31T20:28:48.928 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:48.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1809: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:48.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1809: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:48.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1809: test_mon_osd: grep noout 2026-03-31T20:28:49.131 INFO:tasks.workunit.client.0.vm03.stdout: "noout" 2026-03-31T20:28:49.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1810: test_mon_osd: ceph osd unset-group noin,noout foo 2026-03-31T20:28:50.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1811: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:50.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1811: test_mon_osd: jq .crush_node_flags 2026-03-31T20:28:50.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1811: test_mon_osd: expect_false grep foo 2026-03-31T20:28:50.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:50.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep foo 2026-03-31T20:28:50.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:50.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1813: test_mon_osd: ceph osd set-group noup,nodown,noin,noout foo 2026-03-31T20:28:52.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1814: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:52.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1814: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:52.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1814: test_mon_osd: grep noup 2026-03-31T20:28:52.957 INFO:tasks.workunit.client.0.vm03.stdout: "noup" 2026-03-31T20:28:52.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1815: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:52.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1815: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:52.957 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1815: test_mon_osd: grep nodown 2026-03-31T20:28:53.160 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:53.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1816: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:53.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1816: test_mon_osd: grep noin 2026-03-31T20:28:53.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1816: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:53.372 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:28:53.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1817: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:53.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1817: test_mon_osd: jq .crush_node_flags.foo 2026-03-31T20:28:53.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1817: test_mon_osd: grep noout 2026-03-31T20:28:53.575 INFO:tasks.workunit.client.0.vm03.stdout: "noout", 2026-03-31T20:28:53.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1818: test_mon_osd: ceph osd crush rm foo 2026-03-31T20:28:53.802 INFO:tasks.workunit.client.0.vm03.stderr:device 'foo' does not appear in the crush map 2026-03-31T20:28:53.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1819: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:53.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1819: test_mon_osd: jq .crush_node_flags 2026-03-31T20:28:53.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1819: test_mon_osd: expect_false grep foo 2026-03-31T20:28:53.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:28:53.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep foo 2026-03-31T20:28:54.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:28:54.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1822: test_mon_osd: ceph osd crush get-device-class osd.0 2026-03-31T20:28:54.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1822: test_mon_osd: osd_0_device_class=hdd 2026-03-31T20:28:54.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1823: test_mon_osd: ceph osd set-group noup hdd 2026-03-31T20:28:55.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1824: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:55.777 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1824: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:28:55.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1824: test_mon_osd: grep noup 2026-03-31T20:28:55.978 INFO:tasks.workunit.client.0.vm03.stdout: "noup" 2026-03-31T20:28:55.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1825: test_mon_osd: ceph osd set-group noup,nodown hdd 2026-03-31T20:28:57.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1826: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:57.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1826: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:28:57.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1826: test_mon_osd: grep noup 2026-03-31T20:28:57.997 INFO:tasks.workunit.client.0.vm03.stdout: "noup" 2026-03-31T20:28:57.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1827: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:57.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1827: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:28:57.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1827: test_mon_osd: grep nodown 2026-03-31T20:28:58.200 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:28:58.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1828: test_mon_osd: ceph osd set-group noup,nodown,noin hdd 2026-03-31T20:28:59.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1829: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:28:59.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1829: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:28:59.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1829: test_mon_osd: grep noup 2026-03-31T20:29:00.020 INFO:tasks.workunit.client.0.vm03.stdout: "noup" 2026-03-31T20:29:00.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1830: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:00.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1830: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:00.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1830: test_mon_osd: grep nodown 2026-03-31T20:29:00.229 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:29:00.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1831: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:00.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1831: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:00.230 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1831: test_mon_osd: grep noin 2026-03-31T20:29:00.438 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:29:00.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1832: test_mon_osd: ceph osd set-group noup,nodown,noin,noout hdd 2026-03-31T20:29:01.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1833: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:01.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1833: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:01.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1833: test_mon_osd: grep noup 2026-03-31T20:29:02.028 INFO:tasks.workunit.client.0.vm03.stdout: "noup" 2026-03-31T20:29:02.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1834: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:02.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1834: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:02.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1834: test_mon_osd: grep nodown 2026-03-31T20:29:02.234 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:29:02.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1835: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:02.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1835: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:02.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1835: test_mon_osd: grep noin 2026-03-31T20:29:02.438 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:29:02.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1836: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:02.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1836: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:02.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1836: test_mon_osd: grep noout 2026-03-31T20:29:02.643 INFO:tasks.workunit.client.0.vm03.stdout: "noout", 2026-03-31T20:29:02.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1838: test_mon_osd: ceph osd unset-group noup hdd 2026-03-31T20:29:03.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1839: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:03.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1839: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:03.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1839: test_mon_osd: expect_false grep noup 2026-03-31T20:29:03.843 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:03.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep noup 2026-03-31T20:29:04.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:04.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1840: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:04.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1840: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:04.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1840: test_mon_osd: grep nodown 2026-03-31T20:29:04.256 INFO:tasks.workunit.client.0.vm03.stdout: "nodown", 2026-03-31T20:29:04.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1841: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:04.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1841: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:04.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1841: test_mon_osd: grep noin 2026-03-31T20:29:04.464 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:29:04.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1842: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:04.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1842: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:04.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1842: test_mon_osd: grep noout 2026-03-31T20:29:04.668 INFO:tasks.workunit.client.0.vm03.stdout: "noout" 2026-03-31T20:29:04.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1843: test_mon_osd: ceph osd unset-group noup,nodown hdd 2026-03-31T20:29:05.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1844: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:05.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1844: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:05.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1844: test_mon_osd: expect_false grep 'noup\|nodown' 2026-03-31T20:29:05.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:05.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown' 2026-03-31T20:29:06.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:06.071 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1845: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:06.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1845: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:06.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1845: test_mon_osd: grep noin 2026-03-31T20:29:06.278 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:29:06.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1846: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:06.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1846: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:06.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1846: test_mon_osd: grep noout 2026-03-31T20:29:06.479 INFO:tasks.workunit.client.0.vm03.stdout: "noout" 2026-03-31T20:29:06.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1847: test_mon_osd: ceph osd unset-group noup,nodown,noin hdd 2026-03-31T20:29:07.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1848: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:07.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1848: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:07.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1848: test_mon_osd: expect_false grep 'noup\|nodown\|noin' 2026-03-31T20:29:07.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:07.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown\|noin' 2026-03-31T20:29:08.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:08.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1849: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:08.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1849: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:08.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1849: test_mon_osd: grep noout 2026-03-31T20:29:08.296 INFO:tasks.workunit.client.0.vm03.stdout: "noout" 2026-03-31T20:29:08.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1850: test_mon_osd: ceph osd unset-group noup,nodown,noin,noout hdd 2026-03-31T20:29:09.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1851: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:09.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1851: test_mon_osd: jq 
.device_class_flags.hdd 2026-03-31T20:29:09.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1851: test_mon_osd: expect_false grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:29:09.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:09.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep 'noup\|nodown\|noin\|noout' 2026-03-31T20:29:10.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:10.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1853: test_mon_osd: ceph osd set-group noin,noout hdd 2026-03-31T20:29:11.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1854: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:11.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1854: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:11.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1854: test_mon_osd: grep noin 2026-03-31T20:29:12.102 INFO:tasks.workunit.client.0.vm03.stdout: "noin", 2026-03-31T20:29:12.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1855: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:12.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1855: test_mon_osd: jq .device_class_flags.hdd 2026-03-31T20:29:12.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1855: test_mon_osd: grep noout 2026-03-31T20:29:12.304 INFO:tasks.workunit.client.0.vm03.stdout: "noout" 2026-03-31T20:29:12.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1856: test_mon_osd: ceph osd unset-group noin,noout hdd 2026-03-31T20:29:13.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1857: test_mon_osd: ceph osd dump -f json-pretty 2026-03-31T20:29:13.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1857: test_mon_osd: jq .crush_node_flags 2026-03-31T20:29:13.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1857: test_mon_osd: expect_false grep hdd 2026-03-31T20:29:13.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:13.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep hdd 2026-03-31T20:29:14.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:14.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1860: test_mon_osd: ceph osd reweight osd.0 .5 2026-03-31T20:29:15.915 INFO:tasks.workunit.client.0.vm03.stderr:reweighted osd.0 to 
0.5 (8000) 2026-03-31T20:29:15.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1861: test_mon_osd: ceph osd dump 2026-03-31T20:29:15.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1861: test_mon_osd: grep '^osd.0' 2026-03-31T20:29:15.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1861: test_mon_osd: grep 'weight 0.5' 2026-03-31T20:29:16.135 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 0.5 up_from 221 up_thru 234 down_at 219 last_clean_interval [8,220) v2:192.168.123.103:6804/950776786 v2:192.168.123.103:6813/951776786 exists,up f3bff22c-a21c-470a-aec1-c158b49523b8 2026-03-31T20:29:16.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1862: test_mon_osd: ceph osd out 0 2026-03-31T20:29:16.286 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:16.282+0000 7f8708384640 -1 mon.a@0(leader).osd e328 definitely_dead 0 2026-03-31T20:29:16.969 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:16.966+0000 7f8708384640 -1 mon.a@0(leader).osd e329 definitely_dead 0 2026-03-31T20:29:16.970 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 is already out. 2026-03-31T20:29:16.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1863: test_mon_osd: ceph osd in 0 2026-03-31T20:29:17.131 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:17.126+0000 7f8708384640 -1 mon.a@0(leader).osd e329 definitely_dead 0 2026-03-31T20:29:17.977 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:17.974+0000 7f8708384640 -1 mon.a@0(leader).osd e330 definitely_dead 0 2026-03-31T20:29:17.978 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 is already in. 
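`ceph osd reweight` stores a fixed-point override weight where 0x10000 equals 1.0, which is why the reply above reads "reweighted osd.0 to 0.5 (8000)": 0x8000 is half of 0x10000. The test then round-trips osd.0 through out/in, and the dump immediately below re-checks that weight 0.5 is still reported. Roughly, assuming osd.0 as in this run:

    ceph osd reweight osd.0 .5      # reply echoes the raw hex value: (8000)
    ceph osd out 0
    ceph osd in 0
    ceph osd dump | grep '^osd.0' | grep 'weight 0.5'   # override persists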
2026-03-31T20:29:18.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1866: test_mon_osd: ceph osd getmap -o require_kraken_osds
2026-03-31T20:29:18.401 INFO:tasks.workunit.client.0.vm03.stderr:got osdmap epoch 330
2026-03-31T20:29:18.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1867: test_mon_osd: '[' -s require_kraken_osds ']'
2026-03-31T20:29:18.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1868: test_mon_osd: rm require_kraken_osds
2026-03-31T20:29:18.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1869: test_mon_osd: ceph osd getmaxosd
2026-03-31T20:29:18.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1869: test_mon_osd: sed -e 's/max_osd = //' -e 's/ in epoch.*//'
2026-03-31T20:29:18.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1869: test_mon_osd: save=3
2026-03-31T20:29:18.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1870: test_mon_osd: '[' 3 -gt 0 ']'
2026-03-31T20:29:18.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1871: test_mon_osd: ceph osd setmaxosd 2
2026-03-31T20:29:18.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1871: test_mon_osd: grep EBUSY
2026-03-31T20:29:18.766 INFO:tasks.workunit.client.0.vm03.stdout:Error EBUSY: cannot shrink max_osd to 2 because osd.2 (and possibly others) still in use
2026-03-31T20:29:18.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1872: test_mon_osd: ceph osd setmaxosd 10
2026-03-31T20:29:19.941 INFO:tasks.workunit.client.0.vm03.stderr:set new max_osd = 10
2026-03-31T20:29:19.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1873: test_mon_osd: ceph osd getmaxosd
2026-03-31T20:29:19.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1873: test_mon_osd: grep 'max_osd = 10'
2026-03-31T20:29:20.176 INFO:tasks.workunit.client.0.vm03.stdout:max_osd = 10 in epoch 332
2026-03-31T20:29:20.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1874: test_mon_osd: ceph osd setmaxosd 3
2026-03-31T20:29:21.951 INFO:tasks.workunit.client.0.vm03.stderr:set new max_osd = 3
2026-03-31T20:29:21.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1875: test_mon_osd: ceph osd getmaxosd
2026-03-31T20:29:21.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1875: test_mon_osd: grep 'max_osd = 3'
2026-03-31T20:29:22.188 INFO:tasks.workunit.client.0.vm03.stdout:max_osd = 3 in epoch 334
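test.sh:1866-1875 covers osdmap export and the max_osd bounds: ceph osd getmap -o writes the binary map to a file, the sed pipeline strips "max_osd = 3 in epoch N" down to the bare number so the limit can be restored later, and setmaxosd refuses to shrink below a still-allocated id (the EBUSY case) while growing and restoring succeed. The same sequence, condensed (the file name is kept from the trace):

    ceph osd getmap -o require_kraken_osds     # export the current binary osdmap
    [ -s require_kraken_osds ]                 # a non-empty file means it worked
    rm require_kraken_osds
    save=$(ceph osd getmaxosd | sed -e 's/max_osd = //' -e 's/ in epoch.*//')
    ceph osd setmaxosd 2 | grep EBUSY          # refused: osd.2 is still allocated
    ceph osd setmaxosd 10                      # growing always succeeds
    ceph osd getmaxosd | grep 'max_osd = 10'
    ceph osd setmaxosd "$save"                 # put the saved limit back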
2026-03-31T20:29:22.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1877: test_mon_osd: ceph osd ls
2026-03-31T20:29:22.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1877: test_mon_osd: for id in `ceph osd ls`
2026-03-31T20:29:22.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1878: test_mon_osd: retry_eagain 5 map_enxio_to_eagain ceph tell osd.0 version
2026-03-31T20:29:22.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:58: retry_eagain: local max=5
2026-03-31T20:29:22.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:59: retry_eagain: shift
2026-03-31T20:29:22.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:60: retry_eagain: local status
2026-03-31T20:29:22.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:61: retry_eagain: local tmpfile=/tmp/cephtool.sYl/retry_eagain.26274
2026-03-31T20:29:22.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:62: retry_eagain: local count
2026-03-31T20:29:22.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:63: retry_eagain: seq 1 5
2026-03-31T20:29:22.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:63: retry_eagain: for count in $(seq 1 $max)
2026-03-31T20:29:22.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:64: retry_eagain: status=0
2026-03-31T20:29:22.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:65: retry_eagain: map_enxio_to_eagain ceph tell osd.0 version
2026-03-31T20:29:22.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:66: retry_eagain: test 0 = 0
2026-03-31T20:29:22.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:68: retry_eagain: break
2026-03-31T20:29:22.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:72: retry_eagain: test 1 = 5
2026-03-31T20:29:22.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:75: retry_eagain: cat /tmp/cephtool.sYl/retry_eagain.26274
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:88: map_enxio_to_eagain: local status=0
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:89: map_enxio_to_eagain: local tmpfile=/tmp/cephtool.sYl/map_enxio_to_eagain.26274
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:91: map_enxio_to_eagain: ceph tell osd.0 version
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:92: map_enxio_to_eagain: test 0 '!=' 0
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:96: map_enxio_to_eagain: cat /tmp/cephtool.sYl/map_enxio_to_eagain.26274
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout: "version": "20.2.0-721-g5bb32787",
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout: "release": "tentacle",
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable"
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:97: map_enxio_to_eagain: rm /tmp/cephtool.sYl/map_enxio_to_eagain.26274
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:98: map_enxio_to_eagain: return 0
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:76: retry_eagain: rm /tmp/cephtool.sYl/retry_eagain.26274
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:77: retry_eagain: return 0
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1877: test_mon_osd: for id in `ceph osd ls`
2026-03-31T20:29:22.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1878: test_mon_osd: retry_eagain 5 map_enxio_to_eagain ceph tell osd.1 version
2026-03-31T20:29:22.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:58: retry_eagain: local max=5
2026-03-31T20:29:22.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:59: retry_eagain: shift
2026-03-31T20:29:22.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:60: retry_eagain: local status
2026-03-31T20:29:22.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:61: retry_eagain: local tmpfile=/tmp/cephtool.sYl/retry_eagain.26274
2026-03-31T20:29:22.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:62: retry_eagain: local count
2026-03-31T20:29:22.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:63: retry_eagain: seq 1 5
2026-03-31T20:29:22.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:63: retry_eagain: for count in $(seq 1 $max)
2026-03-31T20:29:22.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:64: retry_eagain: status=0
2026-03-31T20:29:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:65: retry_eagain: map_enxio_to_eagain ceph tell osd.1 version
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:66: retry_eagain: test 0 = 0
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:68: retry_eagain: break
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:72: retry_eagain: test 1 = 5
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:75: retry_eagain: cat /tmp/cephtool.sYl/retry_eagain.26274
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:88: map_enxio_to_eagain: local status=0
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:89: map_enxio_to_eagain: local tmpfile=/tmp/cephtool.sYl/map_enxio_to_eagain.26274
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:91: map_enxio_to_eagain: ceph tell osd.1 version
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:92: map_enxio_to_eagain: test 0 '!=' 0
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:96: map_enxio_to_eagain: cat /tmp/cephtool.sYl/map_enxio_to_eagain.26274
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout: "version": "20.2.0-721-g5bb32787",
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout: "release": "tentacle",
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable"
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:97: map_enxio_to_eagain: rm /tmp/cephtool.sYl/map_enxio_to_eagain.26274
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:98: map_enxio_to_eagain: return 0
2026-03-31T20:29:22.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:76: retry_eagain: rm /tmp/cephtool.sYl/retry_eagain.26274
2026-03-31T20:29:22.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:77: retry_eagain: return 0
2026-03-31T20:29:22.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1877: test_mon_osd: for id in `ceph osd ls`
2026-03-31T20:29:22.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1878: test_mon_osd: retry_eagain 5 map_enxio_to_eagain ceph tell osd.2 version
2026-03-31T20:29:22.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:58: retry_eagain: local max=5
2026-03-31T20:29:22.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:59: retry_eagain: shift
2026-03-31T20:29:22.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:60: retry_eagain: local status
2026-03-31T20:29:22.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:61: retry_eagain: local tmpfile=/tmp/cephtool.sYl/retry_eagain.26274
2026-03-31T20:29:22.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:62: retry_eagain: local count
2026-03-31T20:29:22.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:63: retry_eagain: seq 1 5
2026-03-31T20:29:22.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:63: retry_eagain: for count in $(seq 1 $max)
2026-03-31T20:29:22.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:64: retry_eagain: status=0
2026-03-31T20:29:22.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:65: retry_eagain: map_enxio_to_eagain ceph tell osd.2 version
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:66: retry_eagain: test 0 = 0
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:68: retry_eagain: break
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:72: retry_eagain: test 1 = 5
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:75: retry_eagain: cat /tmp/cephtool.sYl/retry_eagain.26274
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:88: map_enxio_to_eagain: local status=0
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:89: map_enxio_to_eagain: local tmpfile=/tmp/cephtool.sYl/map_enxio_to_eagain.26274
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:91: map_enxio_to_eagain: ceph tell osd.2 version
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:92: map_enxio_to_eagain: test 0 '!=' 0
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:96: map_enxio_to_eagain: cat /tmp/cephtool.sYl/map_enxio_to_eagain.26274
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout: "version": "20.2.0-721-g5bb32787",
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout: "release": "tentacle",
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable"
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:97: map_enxio_to_eagain: rm /tmp/cephtool.sYl/map_enxio_to_eagain.26274
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stdout:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:98: map_enxio_to_eagain: return 0
2026-03-31T20:29:22.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:76: retry_eagain: rm /tmp/cephtool.sYl/retry_eagain.26274
2026-03-31T20:29:22.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:77: retry_eagain: return 0
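The three nearly identical blocks above are one retry cycle per OSD: test.sh:1877-1878 loops over `ceph osd ls` and wraps each `ceph tell osd.N version` in retry_eagain 5 map_enxio_to_eagain, so a momentarily unreachable daemon (ENXIO from the messenger, rewritten to EAGAIN by the inner helper) is retried instead of failing the whole workunit. Reconstructed from the traced line numbers (58-77); the sleep and the EAGAIN grep are assumptions, since the trace only shows the success path where the first attempt exits 0 and the loop breaks:

    retry_eagain() {
        local max=$1
        shift
        local status
        local tmpfile=$TMPDIR/retry_eagain.$$
        local count
        for count in $(seq 1 $max); do
            status=0
            "$@" > $tmpfile 2>&1 || status=$?
            # stop on success; otherwise retry only while the output says EAGAIN
            if test $status = 0 || ! grep -q EAGAIN $tmpfile; then
                break
            fi
            sleep 1
        done
        test $count = $max && echo "retried $max times: $*" >&2   # the check at line 72
        cat $tmpfile
        rm $tmpfile
        return $status
    }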
2026-03-31T20:29:22.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1881: test_mon_osd: ceph osd rm 0
2026-03-31T20:29:22.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1881: test_mon_osd: grep EBUSY
2026-03-31T20:29:22.838 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:22.834+0000 7f8708384640 -1 mon.a@0(leader).osd e334 definitely_dead 0
2026-03-31T20:29:22.842 INFO:tasks.workunit.client.0.vm03.stdout:Error EBUSY: osd.0 is still up; must be down before removal.
2026-03-31T20:29:22.842 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1883: test_mon_osd: ceph osd ls
2026-03-31T20:29:23.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1883: test_mon_osd: echo 0 1 2
2026-03-31T20:29:23.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1883: test_mon_osd: local 'old_osds=0 1 2'
2026-03-31T20:29:23.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1884: test_mon_osd: ceph osd create
2026-03-31T20:29:24.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1884: test_mon_osd: id=4
2026-03-31T20:29:24.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1885: test_mon_osd: ceph osd find 4
2026-03-31T20:29:25.180 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-31T20:29:25.180 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 4,
2026-03-31T20:29:25.180 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": {
2026-03-31T20:29:25.180 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": []
2026-03-31T20:29:25.180 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:29:25.180 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "00000000-0000-0000-0000-000000000000",
2026-03-31T20:29:25.181 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": {}
2026-03-31T20:29:25.181 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:29:25.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1886: test_mon_osd: ceph osd lost 4 --yes-i-really-mean-it
2026-03-31T20:29:26.981 INFO:tasks.workunit.client.0.vm03.stderr:marked osd lost in epoch 0
2026-03-31T20:29:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1887: test_mon_osd: expect_false ceph osd setmaxosd 4
2026-03-31T20:29:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:29:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd setmaxosd 4
2026-03-31T20:29:27.149 INFO:tasks.workunit.client.0.vm03.stderr:Error EBUSY: cannot shrink max_osd to 4 because osd.4 (and possibly others) still in use
2026-03-31T20:29:27.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
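test.sh:1881-1887 checks the id-lifecycle guards: removing an up OSD fails with EBUSY, a bare ceph osd create allocates a fresh id (4 here) with nothing behind it, ceph osd find shows that id's empty addrvec and crush_location before any daemon has booted, ceph osd lost requires --yes-i-really-mean-it, and setmaxosd again refuses to shrink past the newly allocated id. Condensed from the trace:

    ceph osd rm 0 | grep EBUSY                  # refused while osd.0 is up
    id=$(ceph osd create)                       # fresh id, no daemon behind it
    ceph osd find $id                           # empty addrvec / crush_location {}
    ceph osd lost $id --yes-i-really-mean-it    # destroys data if real replicas were lost
    ! ceph osd setmaxosd $id                    # EBUSY: the id is now allocated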
2026-03-31T20:29:27.153 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1888: test_mon_osd: ceph osd ls
2026-03-31T20:29:27.357 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1888: test_mon_osd: echo 0 1 2 3 4
2026-03-31T20:29:27.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1888: test_mon_osd: local 'new_osds=0 1 2 3 4'
2026-03-31T20:29:27.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1889: test_mon_osd: echo 0 1 2 3 4
2026-03-31T20:29:27.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1889: test_mon_osd: sed -e 's/0 1 2//'
2026-03-31T20:29:27.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1889: test_mon_osd: for id in $(echo $new_osds | sed -e "s/$old_osds//")
2026-03-31T20:29:27.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1890: test_mon_osd: ceph osd rm 3
2026-03-31T20:29:27.513 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:27.506+0000 7f8708384640 -1 mon.a@0(leader).osd e338 definitely_dead 0
2026-03-31T20:29:28.051 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:28.046+0000 7f8708384640 -1 mon.a@0(leader).osd e339 definitely_dead 0
2026-03-31T20:29:28.051 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 does not exist.
2026-03-31T20:29:28.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1889: test_mon_osd: for id in $(echo $new_osds | sed -e "s/$old_osds//")
2026-03-31T20:29:28.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1890: test_mon_osd: ceph osd rm 4
2026-03-31T20:29:28.214 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:28.210+0000 7f8708384640 -1 mon.a@0(leader).osd e339 definitely_dead 0
2026-03-31T20:29:29.060 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:29.054+0000 7f8708384640 -1 mon.a@0(leader).osd e340 definitely_dead 0
2026-03-31T20:29:29.061 INFO:tasks.workunit.client.0.vm03.stderr:osd.4 does not exist.
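The cleanup at test.sh:1888-1890 computes the set of newly created ids by string-subtracting the old `ceph osd ls` output from the new one with sed, then removes each id; the trace shows `ceph osd rm` treating an already-gone id ("osd.N does not exist.") as success, so the loop tolerates reruns. The same pattern as in the script:

    old_osds=$(echo $(ceph osd ls))     # e.g. "0 1 2"
    # ... ids get created ...
    new_osds=$(echo $(ceph osd ls))     # e.g. "0 1 2 3 4"
    for id in $(echo $new_osds | sed -e "s/$old_osds//"); do
        ceph osd rm $id                 # only the ids that were not there before
    done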
2026-03-31T20:29:29.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1893: test_mon_osd: uuidgen
2026-03-31T20:29:29.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1893: test_mon_osd: uuid=98628724-067b-48d0-ba8a-821b377a9176
2026-03-31T20:29:29.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1894: test_mon_osd: ceph osd create 98628724-067b-48d0-ba8a-821b377a9176
2026-03-31T20:29:30.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1894: test_mon_osd: id=3
2026-03-31T20:29:30.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1895: test_mon_osd: ceph osd create 98628724-067b-48d0-ba8a-821b377a9176
2026-03-31T20:29:30.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1895: test_mon_osd: id2=3
2026-03-31T20:29:30.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1896: test_mon_osd: '[' 3 = 3 ']'
2026-03-31T20:29:30.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1897: test_mon_osd: ceph osd rm 3
2026-03-31T20:29:30.446 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:30.442+0000 7f8708384640 -1 mon.a@0(leader).osd e341 definitely_dead 0
2026-03-31T20:29:31.072 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:31.066+0000 7f8708384640 -1 mon.a@0(leader).osd e342 definitely_dead 0
2026-03-31T20:29:31.072 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 does not exist.
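test.sh:1893-1897 verifies that ceph osd create is idempotent per fsid: calling it twice with the same uuid returns the same id instead of allocating a second one, which is what lets provisioning tools retry safely. Straight from the trace:

    uuid=$(uuidgen)
    id=$(ceph osd create $uuid)
    id2=$(ceph osd create $uuid)    # same uuid, same id
    [ "$id" = "$id2" ]
    ceph osd rm $id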
2026-03-31T20:29:31.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1899: test_mon_osd: ceph --help osd 2026-03-31T20:29:31.186 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: General usage: 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: ============== 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout:usage: ceph [-h] [-c CEPHCONF] [-i INPUT_FILE] [-o OUTPUT_FILE] 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: [--setuser SETUSER] [--setgroup SETGROUP] [--id CLIENT_ID] 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: [--name CLIENT_NAME] [--cluster CLUSTER] 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: [--admin-daemon ADMIN_SOCKET] [-s] [-w] [--watch-debug] 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: [--watch-info] [--watch-sec] [--watch-warn] [--watch-error] 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: [-W WATCH_CHANNEL] [--version] [--verbose] [--concise] 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: [--daemon-output-file DAEMON_OUTPUT_FILE] 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: [-f {json,json-pretty,xml,xml-pretty,plain,yaml}] 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: [--connect-timeout CLUSTER_TIMEOUT] [--block] [--period PERIOD] 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout:Ceph administration tool 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout:options: 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: -h, --help request mon help 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: -c CEPHCONF, --conf CEPHCONF 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: ceph configuration file 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: -i INPUT_FILE, --in-file INPUT_FILE 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: input file, or "-" for stdin 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: -o OUTPUT_FILE, --out-file OUTPUT_FILE 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: output file, or "-" for stdout 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --setuser SETUSER set user file permission 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --setgroup SETGROUP set group file permission 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --id CLIENT_ID, --user CLIENT_ID 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: client id for authentication 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --name CLIENT_NAME, -n CLIENT_NAME 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: client name for authentication 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --cluster CLUSTER cluster name 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --admin-daemon ADMIN_SOCKET 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: submit admin-socket command (e.g. 
"help" fora list of 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: available commands) 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: -s, --status show cluster status 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: -w, --watch watch live cluster changes 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --watch-debug watch debug events 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --watch-info watch info events 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --watch-sec watch security events 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --watch-warn watch warn events 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --watch-error watch error events 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: -W WATCH_CHANNEL, --watch-channel WATCH_CHANNEL 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: watch live cluster changes on a specific channel 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: (e.g., cluster, audit, cephadm, or '*' for all) 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --version, -v display version 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --verbose make verbose 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --concise make less verbose 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --daemon-output-file DAEMON_OUTPUT_FILE 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: output file location local to the daemon for JSON 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: produced by tell commands 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: -f {json,json-pretty,xml,xml-pretty,plain,yaml}, --format {json,json-pretty,xml,xml-pretty,plain,yaml} 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: Note: yaml is only valid for orch commands 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --connect-timeout CLUSTER_TIMEOUT 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: set a timeout for connecting to the cluster 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --block block until completion (scrub and deep-scrub only) 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: --period PERIOD, -p PERIOD 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: polling period, default 1.0 second (for polling 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: commands only) 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: Local commands: 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: =============== 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.187 INFO:tasks.workunit.client.0.vm03.stdout:ping Send simple presence/life test to a mon 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: may be 'mon.*' for all mons 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:daemon {type.id|path} 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: Same as --admin-daemon, but auto-find admin socket 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:daemonperf {type.id | path} [stat-pats] [priority] [] [] 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:daemonperf {type.id | 
path} list|ls [stat-pats] [priority] 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: Get selected perf stats from daemon/admin socket 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: Optional shell-glob comma-delim match string stat-pats 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: Optional selection priority (can abbreviate name): 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: critical, interesting, useful, noninteresting, debug 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: List shows a table of all available stats 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: Run times (default forever), 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: once per seconds (default 1) 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: Monitor commands: 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: ================= 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd blocked-by print histogram of which OSDs are 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: blocking their peers 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd blocklist [] seconds 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: add|rm> [] from now) or remove from 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: blocklist 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd blocklist clear clear all blocklisted clients 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd blocklist ls show blocklisted clients 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd count-metadata count OSDs by metadata field property 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush add ... weight for with and 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: location 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush add-bucket add no-parent (probably root) crush 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: [...] bucket of type to 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: location 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush class create create crush device class 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush class ls list all crush device classes 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush class ls-osd list all osds belonging to the specific 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush class rename rename crush device class to 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush class rm remove crush device class 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush create-or-move create entry or move existing entry for 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: ... at/to location 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush dump dump crush map 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush get-device-class ... 
get classes of specified osd(s) 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: [...] 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush get-tunable 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: calc_version> 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush link ... link existing entry for under 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: location 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush ls list items beneath a node in the CRUSH 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: tree 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush move ... move existing entry for to 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: location 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rename-bucket rename bucket to 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush reweight 's weight to in 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: float> crush map 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush reweight-all recalculate the weights for the tree to 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: ensure they sum correctly 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush reweight-subtree change all leaf items beneath to 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: in crush map 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rm [] remove from crush map ( 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: everywhere, or just at ) 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rm-device-class ... remove class of the osd(s) [... 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: ],or use to remove all. 
2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rule create-erasure create crush rule for erasure 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: [] coded pool created with ( 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: default default) 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rule create-replicated create crush rule for replicated 2026-03-31T20:29:31.188 INFO:tasks.workunit.client.0.vm03.stdout: [] pool to start from , replicate 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: across buckets of type , use 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: devices of type (ssd or hdd) 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rule create-simple create crush rule to start from 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: [] , replicate across buckets of 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: type , using a choose mode of 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: (default firstn; indep 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: best for erasure pools) 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rule dump [] dump crush rule (default all) 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rule ls list crush rules 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rule ls-by-class list all crush rules that reference the 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: same 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rule rename rename crush rule to 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush rule rm remove crush rule 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush set ... to with location 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush set [] set crush map from input file 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush set-all-straw-buckets-to- convert all CRUSH current straw buckets 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: straw2 to use the straw2 algorithm 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush set-device-class set the of the osd(s) 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: ... [...],or use to set all. 
2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush set-tunable to 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: calc_version> 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush show-tunables show current crush tunables 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush swap-bucket swap existing bucket contents from ( 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: [--yes-i-really-mean-it] orphan) bucket and 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush tree [--show-shadow] dump crush buckets and items in a tree 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: view 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush tunables 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: argonaut|bobtail|firefly|hammer|jewel| 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: optimal|default> 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush unlink [] unlink from crush map ( 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: everywhere, or just at ) 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush weight-set create create a weight-set for a given pool 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush weight-set create-compat create a default backward-compatible 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: weight-set 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush weight-set dump dump crush weight sets 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush weight-set ls list crush weight sets 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush weight-set reweight set weight for an item (bucket or osd) 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: ... in a pool's weight-set 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush weight-set reweight-compat set weight for an item (bucket or osd) 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: ... in the backward-compatible weight-set 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush weight-set rm remove the weight-set for a given pool 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd crush weight-set rm-compat remove the backward-compatible weight- 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: set 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd deep-scrub initiate deep scrub on osd , or 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: use to deep scrub all 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd destroy [--force] [-- mark osd as being destroyed. Keeps the 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: yes-i-really-mean-it] ID intact (allowing reuse), but 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: removes cephx keys, config-key data 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: and lockbox keys, rendering data 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: permanently unreadable. 
2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd df [] show OSD utilization 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: [] [] 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd down ... [--definitely-dead] set osd(s) [...] down, or use 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: to set all osds down 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd dump [] print summary of OSD map 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd erasure-code-profile get get erasure code profile 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd erasure-code-profile ls list all erasure code profiles 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd erasure-code-profile rm remove erasure code profile 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd erasure-code-profile set create erasure code profile with 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: [...] [--force] [--yes-i- [ ...] pairs. Add a -- 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: really-mean-it] force at the end to override an 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout: existing profile (VERY DANGEROUS) 2026-03-31T20:29:31.189 INFO:tasks.workunit.client.0.vm03.stdout:osd find find osd in the CRUSH map and show 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: its location 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd force-create-pg [--yes-i- force creation of pg 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: really-mean-it] 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd force_healthy_stretch_mode [--yes- force a healthy stretch mode, requiring 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: i-really-mean-it] the full number of CRUSH buckets to 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: peer and letting all non-tiebreaker 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: monitors be elected leader 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd force_recovery_stretch_mode [--yes- try and force a recovery stretch mode, 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: i-really-mean-it] increasing the pool size to its non- 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: failure value if currently degraded 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: and all monitor buckets are up 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd get-require-min-compat-client get the minimum client version we will 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: maintain compatibility with 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd getcrushmap [] get CRUSH map 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd getmap [] get OSD map 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd getmaxosd show largest OSD id 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd in ... set osd(s) [...] 
in, can use 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: to automatically set all 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: previously out osds in 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd info [] print osd's {id} information (instead 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: of all osds from map) 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd last-stat-seq get the last pg stats sequence number 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: reported for this osd 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd lost [--yes-i-really- mark osd as permanently lost. THIS 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: mean-it] DESTROYS DATA IF NO MORE REPLICAS 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: EXIST, BE CAREFUL 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd ls [] show all OSD ids 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd ls-tree [] show OSD ids under bucket in the 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: CRUSH map 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd map [] find pg for in with 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: [namespace] 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd metadata [] fetch metadata for osd {id} (default 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: all) 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd new [] Create a new OSD. If supplied, the `id` 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: to be replaced needs to exist and have 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: been previously destroyed. Reads 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: secrets from JSON file via `-i ` 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: (see man page). 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd numa-status show NUMA status of OSDs 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd ok-to-stop ... [] check whether osd(s) can be safely 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: stopped without reducing immediate 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: data availability 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd out ... set osd(s) [...] out, or use 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: to set all osds out 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd pause pause osd 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd perf print dump of OSD perf summary stats 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd perf counters get fetch osd perf counters 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd perf query add 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd perf query remove remove osd perf query 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd pg-temp [...] set pg_temp mapping :[ [.. 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: .]] (developers only) 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd pg-upmap ... set pg_upmap mapping :[ [. 
2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: ..]] (developers only) 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd pg-upmap-items .. set pg_upmap_items mapping :{ 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: . to , [...]} (developers only) 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd pg-upmap-primary set pg primary osd : (id (osd) 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: must be part of pgid) 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd pool application disable disables use of an application on 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: [--yes-i-really-mean-it] pool 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd pool application enable enable use of an application 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: [--yes-i-really-mean-it] [cephfs,rbd,rgw] on pool 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd pool application get [] get value of key of application 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: [] [] on pool 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout:osd pool application rm removes application metadata key 2026-03-31T20:29:31.190 INFO:tasks.workunit.client.0.vm03.stdout: on pool 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool application set sets application metadata key 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: to on pool 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool autoscale-status [--format report on pool pg_num sizing 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: ] recommendation and intent 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool availability-status obtain availability stats from all pools 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool cancel-force-backfill ... restore normal recovery priority of 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: specified pool 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool cancel-force-recovery ... restore normal recovery priority of 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: specified pool 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool clear-availability-status clear a pool's existing availability 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: stats 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool create [] create pool 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: [] [] [] [] [] [] [] [] 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: [] [-- 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: bulk] [] 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: [] [--yes-i- 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: really-mean-it] 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool deep-scrub ... initiate deep-scrub on pool 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool force-backfill ... force backfill of specified pool 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout: first 2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool force-recovery ... 
force recovery of specified pool first
2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool force-remove-snap [] [] [--dry-run]  Forces removal of snapshots in the snapid_bound) on pool in order to cause OSDs to re-trim them.
2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool get pg_num|pgp_num|crush_rule|hashpspool|nodelete|nopgchange|nosizechange|write_fadvise_dontneed|noscrub|nodeep-scrub|hit_set_type|hit_set_period|hit_set_count|hit_set_fpp|use_gmt_hitset|target_max_objects|target_max_bytes|cache_target_dirty_ratio|cache_target_dirty_high_ratio|cache_target_full_ratio|cache_min_flush_age|cache_min_evict_age|erasure_code_profile|min_read_recency_for_promote|all|min_write_recency_for_promote|fast_read|hit_set_grade_decay_rate|hit_set_search_last_n|scrub_min_interval|scrub_max_interval|deep_scrub_interval|recovery_priority|recovery_op_priority|scrub_priority|compression_mode|compression_algorithm|compression_required_ratio|compression_max_blob_size|compression_min_blob_size|csum_type|csum_min_block|csum_max_block|allow_ec_overwrites|fingerprint_algorithm|pg_autoscale_mode|pg_autoscale_bias|pg_num_min|pg_num_max|target_size_bytes|target_size_ratio|dedup_tier|dedup_chunk_algorithm|dedup_cdc_chunk_size|eio|bulk|read_ratio|pct_update_delay|allow_ec_optimizations>
2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool get noautoscale  Get the noautoscale flag to see if all pools are setting the autoscaler on or off as well as newly created pools in the future.
2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool get-quota  obtain object or byte limits for pool
2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool ls []  list pools
2026-03-31T20:29:31.191 INFO:tasks.workunit.client.0.vm03.stdout:osd pool mksnap  make snapshot in
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool rename [--yes-i-really-mean-it]  rename to
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool repair ...  initiate repair on pool
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool rm [] [--yes-i-really-really-mean-it] [--yes-i-really-really-mean-it-not-faking]  remove pool
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool rmsnap  remove snapshot from
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool scrub ...  initiate scrub on pool
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool set to pg_num|pgp_num|pgp_num_actual|crush_rule|hashpspool|nodelete|nopgchange|nosizechange|write_fadvise_dontneed|noscrub|nodeep-scrub|hit_set_type|hit_set_period|hit_set_count|hit_set_fpp|use_gmt_hitset|target_max_bytes|target_max_objects|cache_target_dirty_ratio|cache_target_dirty_high_ratio|cache_target_full_ratio|cache_min_flush_age|cache_min_evict_age|min_read_recency_for_promote|min_write_recency_for_promote|fast_read|hit_set_grade_decay_rate|hit_set_search_last_n|scrub_min_interval|scrub_max_interval|deep_scrub_interval|recovery_priority|recovery_op_priority|scrub_priority|compression_mode|compression_algorithm|compression_required_ratio|compression_max_blob_size|compression_min_blob_size|csum_type|csum_min_block|csum_max_block|allow_ec_overwrites|fingerprint_algorithm|pg_autoscale_mode|pg_autoscale_bias|pg_num_min|pg_num_max|target_size_bytes|target_size_ratio|dedup_tier|dedup_chunk_algorithm|dedup_cdc_chunk_size|eio|bulk|read_ratio|pct_update_delay|allow_ec_optimizations> [--yes-i-really-mean-it]
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool set noautoscale  set the noautoscale for all pools (including newly created pools in the future) and complete all on-going progress events regarding PG-autoscaling.
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool set threshold  set the autoscaler threshold A.K.A. the factor by which the new PG_NUM must vary from the existing PG_NUM
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool set-quota
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool stats []  obtain stats from all pools, or from specified pool
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool stretch set [--yes-i-really-mean-it]
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool stretch show  show all the stretch related information for the pool
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool stretch unset
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd pool unset noautoscale  Unset the noautoscale flag so all pools will go back to its previous mode. Newly created pools in the future will autoscaler on by default.
2026-03-31T20:29:31.192 INFO:tasks.workunit.client.0.vm03.stdout:osd primary-affinity  adjust osd primary-affinity from 0.0 <= <= 1.0
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd primary-temp  set primary_temp mapping pgid: (developers only)
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd purge [--force] [--yes-i-really-mean-it]  purge all osd data from the monitors including the OSD id and CRUSH position
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd purge-new [--yes-i-really-mean-it]  purge all traces of an OSD that was partially created but never started
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd repair  initiate repair on osd , or use to repair all
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd require-osd-release [--yes-i-really-mean-it]
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd reweight  reweight osd to 0.0 < < 1.0
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd reweight-by-pg [] [] [] [...]  [overload-percentage-for-consideration, default 120]
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd reweight-by-utilization [] [] [] [--no-increasing]  20]
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd reweightn  reweight osds with {: ,...}
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd rm-pg-upmap  clear pg_upmap mapping for (developers only)
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd rm-pg-upmap-items  clear pg_upmap_items mapping for (developers only)
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd rm-pg-upmap-primary  clear pg primary setting for
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd rm-pg-upmap-primary-all  clear all pg primary entries (developers only)
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd rm-primary-temp  clear primary_temp mapping pgid (developers only)
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd safe-to-destroy ...  check whether osd(s) can be safely destroyed without reducing data durability
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd scrub  initiate scrub on osd , or use to scrub all
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd set noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|pglog_hardlimit|noautoscale> [--yes-i-really-mean-it]
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd set-allow-crimson [--yes-i-really-mean-it]  Allow crimson-osds to boot and join the cluster. Note, crimson-osd is not yet considered stable and may crash or cause data loss -- should be avoided outside of testing and development. This setting is irrevocable
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd set-backfillfull-ratio  marked too full to backfill
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd set-full-ratio  set usage ratio at which OSDs are marked full
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd set-group ...  set for batch osds or crush nodes, must be a comma-separated subset of {noup,nodown,noin,noout}
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd set-nearfull-ratio  set usage ratio at which OSDs are marked near-full
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd set-require-min-compat-client [--yes-i-really-mean-it]  set the minimum client version we will maintain compatibility with
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd setcrushmap []  set crush map from input file
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd setmaxosd  set new maximum osd value
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd stat  print summary of OSD map
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd status [] [--format ]  Show the status of OSDs within a bucket, or all
2026-03-31T20:29:31.193 INFO:tasks.workunit.client.0.vm03.stdout:osd stop ...  stop the corresponding osd daemons and mark them as down
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd test-reweight-by-pg [] [] [] [...]  dry run of reweight OSDs by PG distribution [overload-percentage-for-consideration, default 120]
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd test-reweight-by-utilization [] [] [] [--no-increasing]  dry run of reweight OSDs by utilization [overload-percentage-for-consideration, default 120]
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd tier add [--force-nonempty]  add the tier (the second one) to base pool (the first one)
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd tier add-cache  add a cache (the second one) of size to existing pool (the first one)
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd tier cache-mode none> [--yes-i-really-mean-it]
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd tier rm  remove the tier (the second one) from base pool (the first one)
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd tier rm-overlay  remove the overlay pool for base pool
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd tier set-overlay  set the overlay pool for base pool to be
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd tree [] [...]
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd tree-from [] [...]  print OSD tree in bucket
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd unpause  unpause osd
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd unset noout|noin|nobackfill|norebalance|norecover|noscrub|nodeep-scrub|notieragent|nosnaptrim|noautoscale>
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd unset-group ...  unset for batch osds or crush nodes, must be a comma-separated subset of {noup,nodown,noin,noout}
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd utilization  get basic pg distribution stats
2026-03-31T20:29:31.194 INFO:tasks.workunit.client.0.vm03.stdout:osd versions  check running versions of OSDs
2026-03-31T20:29:31.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1902: test_mon_osd: ceph osd setmaxosd 3 2026-03-31T20:29:33.035 INFO:tasks.workunit.client.0.vm03.stderr:set new max_osd = 3 2026-03-31T20:29:33.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1903: test_mon_osd: ceph osd getmaxosd 2026-03-31T20:29:33.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1903: test_mon_osd: grep 'max_osd = 3' 2026-03-31T20:29:33.254 INFO:tasks.workunit.client.0.vm03.stdout:max_osd = 3 in epoch 344 2026-03-31T20:29:33.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1904: test_mon_osd: local max_osd=3 2026-03-31T20:29:33.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1906: test_mon_osd: ceph osd create 98628724-067b-48d0-ba8a-821b377a9176 0 2026-03-31T20:29:33.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1906: test_mon_osd: grep EINVAL 2026-03-31T20:29:33.410 INFO:tasks.workunit.client.0.vm03.stdout:Error EINVAL: id 0 already in use and does not match uuid 98628724-067b-48d0-ba8a-821b377a9176 2026-03-31T20:29:33.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1907: test_mon_osd: ceph osd create 98628724-067b-48d0-ba8a-821b377a9176 2 2026-03-31T20:29:33.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1907: test_mon_osd: grep EINVAL 2026-03-31T20:29:33.563 INFO:tasks.workunit.client.0.vm03.stdout:Error EINVAL: id 2 already in use and does not match uuid 98628724-067b-48d0-ba8a-821b377a9176 2026-03-31T20:29:33.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1909: test_mon_osd: ceph osd create 98628724-067b-48d0-ba8a-821b377a9176 3 2026-03-31T20:29:34.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1909: test_mon_osd: id=3 2026-03-31T20:29:34.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1910: test_mon_osd: '[' 3 = 3 ']' 2026-03-31T20:29:34.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1911: test_mon_osd: ceph osd find 3 2026-03-31T20:29:34.298 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:29:34.298 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 3, 2026-03-31T20:29:34.298 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:29:34.298 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [] 2026-03-31T20:29:34.298 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:29:34.298 
INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "98628724-067b-48d0-ba8a-821b377a9176", 2026-03-31T20:29:34.298 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": {} 2026-03-31T20:29:34.298 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:29:34.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1912: test_mon_osd: max_osd=4 2026-03-31T20:29:34.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1913: test_mon_osd: ceph osd getmaxosd 2026-03-31T20:29:34.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1913: test_mon_osd: grep 'max_osd = 4' 2026-03-31T20:29:34.512 INFO:tasks.workunit.client.0.vm03.stdout:max_osd = 4 in epoch 345 2026-03-31T20:29:34.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1915: test_mon_osd: ceph osd create 98628724-067b-48d0-ba8a-821b377a9176 2 2026-03-31T20:29:34.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1915: test_mon_osd: grep EEXIST 2026-03-31T20:29:34.666 INFO:tasks.workunit.client.0.vm03.stdout:Error EEXIST: uuid 98628724-067b-48d0-ba8a-821b377a9176 already in use for different id 3 2026-03-31T20:29:34.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1916: test_mon_osd: ceph osd create 98628724-067b-48d0-ba8a-821b377a9176 4 2026-03-31T20:29:34.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1916: test_mon_osd: grep EEXIST 2026-03-31T20:29:34.820 INFO:tasks.workunit.client.0.vm03.stdout:Error EEXIST: uuid 98628724-067b-48d0-ba8a-821b377a9176 already in use for different id 3 2026-03-31T20:29:34.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1917: test_mon_osd: ceph osd create 98628724-067b-48d0-ba8a-821b377a9176 2026-03-31T20:29:35.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1917: test_mon_osd: id2=3 2026-03-31T20:29:35.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1918: test_mon_osd: '[' 3 = 3 ']' 2026-03-31T20:29:35.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1919: test_mon_osd: ceph osd create 98628724-067b-48d0-ba8a-821b377a9176 3 2026-03-31T20:29:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1919: test_mon_osd: id2=3 2026-03-31T20:29:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1920: test_mon_osd: '[' 3 = 3 ']' 2026-03-31T20:29:35.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1922: test_mon_osd: uuidgen 2026-03-31T20:29:35.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1922: test_mon_osd: uuid=2b160d57-d980-4503-9cdc-170be7fecfad 2026-03-31T20:29:35.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1923: test_mon_osd: local gap_start=4 2026-03-31T20:29:35.248 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1924: test_mon_osd: ceph osd create 2b160d57-d980-4503-9cdc-170be7fecfad 104 2026-03-31T20:29:36.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1924: test_mon_osd: id=104 2026-03-31T20:29:36.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1925: test_mon_osd: '[' 104 = 104 ']' 2026-03-31T20:29:36.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1926: test_mon_osd: max_osd=105 2026-03-31T20:29:36.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1927: test_mon_osd: ceph osd getmaxosd 2026-03-31T20:29:36.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1927: test_mon_osd: grep 'max_osd = 105' 2026-03-31T20:29:36.326 INFO:tasks.workunit.client.0.vm03.stdout:max_osd = 105 in epoch 346 2026-03-31T20:29:36.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1929: test_mon_osd: ceph osd create 2b160d57-d980-4503-9cdc-170be7fecfad 4 2026-03-31T20:29:36.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1929: test_mon_osd: grep EEXIST 2026-03-31T20:29:36.488 INFO:tasks.workunit.client.0.vm03.stdout:Error EEXIST: uuid 2b160d57-d980-4503-9cdc-170be7fecfad already in use for different id 104 2026-03-31T20:29:36.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1935: test_mon_osd: local next_osd=4 2026-03-31T20:29:36.489 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1936: test_mon_osd: uuidgen 2026-03-31T20:29:36.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1936: test_mon_osd: ceph osd create f232863d-3e78-4f79-9f3e-e6915f326670 2026-03-31T20:29:37.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1936: test_mon_osd: id=4 2026-03-31T20:29:37.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1937: test_mon_osd: '[' 4 = 4 ']' 2026-03-31T20:29:37.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1939: test_mon_osd: next_osd=5 2026-03-31T20:29:37.122 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1940: test_mon_osd: uuidgen 2026-03-31T20:29:37.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1940: test_mon_osd: ceph osd create adbdef1a-36d3-46a1-907d-cc02c0e35241 5 2026-03-31T20:29:38.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1940: test_mon_osd: id=5 2026-03-31T20:29:38.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1941: test_mon_osd: '[' 5 = 5 ']' 2026-03-31T20:29:38.132 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1943: test_mon_osd: ceph osd ls 2026-03-31T20:29:38.345 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1943: test_mon_osd: echo 0 1 2 3 4 5 104 2026-03-31T20:29:38.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1943: test_mon_osd: local 'new_osds=0 1 2 3 4 5 104' 2026-03-31T20:29:38.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1944: test_mon_osd: echo 0 1 2 3 4 5 104 2026-03-31T20:29:38.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1944: test_mon_osd: sed -e 's/0 1 2//' 2026-03-31T20:29:38.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1944: test_mon_osd: for id in $(echo $new_osds | sed -e "s/$old_osds//") 2026-03-31T20:29:38.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1945: test_mon_osd: '[' 3 -ge 3 ']' 2026-03-31T20:29:38.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1946: test_mon_osd: ceph osd rm 3 2026-03-31T20:29:38.499 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:38.494+0000 7f8708384640 -1 mon.a@0(leader).osd e348 definitely_dead 0 2026-03-31T20:29:39.127 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:39.122+0000 7f8708384640 -1 mon.a@0(leader).osd e349 definitely_dead 0 2026-03-31T20:29:39.127 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 does not exist. 2026-03-31T20:29:39.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1944: test_mon_osd: for id in $(echo $new_osds | sed -e "s/$old_osds//") 2026-03-31T20:29:39.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1945: test_mon_osd: '[' 4 -ge 3 ']' 2026-03-31T20:29:39.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1946: test_mon_osd: ceph osd rm 4 2026-03-31T20:29:39.289 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:39.286+0000 7f8708384640 -1 mon.a@0(leader).osd e349 definitely_dead 0 2026-03-31T20:29:40.132 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:40.126+0000 7f8708384640 -1 mon.a@0(leader).osd e350 definitely_dead 0 2026-03-31T20:29:40.132 INFO:tasks.workunit.client.0.vm03.stderr:osd.4 does not exist. 2026-03-31T20:29:40.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1944: test_mon_osd: for id in $(echo $new_osds | sed -e "s/$old_osds//") 2026-03-31T20:29:40.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1945: test_mon_osd: '[' 5 -ge 3 ']' 2026-03-31T20:29:40.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1946: test_mon_osd: ceph osd rm 5 2026-03-31T20:29:40.299 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:40.294+0000 7f8708384640 -1 mon.a@0(leader).osd e350 definitely_dead 0 2026-03-31T20:29:41.143 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:41.138+0000 7f8708384640 -1 mon.a@0(leader).osd e351 definitely_dead 0 2026-03-31T20:29:41.143 INFO:tasks.workunit.client.0.vm03.stderr:osd.5 does not exist. 
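The create/rm loop in this stretch of the trace pins down the id-to-uuid allocation rules: each uuid binds to exactly one id, re-creating with the same uuid is idempotent, and mismatched pairs are rejected. A condensed sketch of the same checks; the uuid variable and the id 99 are illustrative, not taken from this run:

  uuid=$(uuidgen)
  id=$(ceph osd create "$uuid")       # binds the lowest free id to $uuid, raising max_osd if needed
  ceph osd create "$uuid"             # idempotent: the same uuid yields the same id again
  ceph osd create "$(uuidgen)" "$id"  # Error EINVAL: id already in use with a different uuid
  ceph osd create "$uuid" 99          # Error EEXIST: uuid already in use for a different id
  ceph osd rm "$id"                   # releases the never-booted id again
  ceph osd setmaxosd 3                # shrink the map back once the gap ids are removed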
2026-03-31T20:29:41.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1944: test_mon_osd: for id in $(echo $new_osds | sed -e "s/$old_osds//") 2026-03-31T20:29:41.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1945: test_mon_osd: '[' 104 -ge 3 ']' 2026-03-31T20:29:41.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1946: test_mon_osd: ceph osd rm 104 2026-03-31T20:29:41.304 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:41.298+0000 7f8708384640 -1 mon.a@0(leader).osd e351 definitely_dead 0 2026-03-31T20:29:42.148 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:29:42.142+0000 7f8708384640 -1 mon.a@0(leader).osd e352 definitely_dead 0 2026-03-31T20:29:42.149 INFO:tasks.workunit.client.0.vm03.stderr:osd.104 does not exist. 2026-03-31T20:29:42.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1948: test_mon_osd: ceph osd setmaxosd 3 2026-03-31T20:29:44.112 INFO:tasks.workunit.client.0.vm03.stderr:set new max_osd = 3 2026-03-31T20:29:44.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1950: test_mon_osd: ceph osd ls 2026-03-31T20:29:44.324 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-31T20:29:44.324 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-31T20:29:44.324 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-31T20:29:44.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1951: test_mon_osd: ceph osd pool create data 16 2026-03-31T20:29:45.170 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data' already exists 2026-03-31T20:29:45.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1952: test_mon_osd: ceph osd pool application enable data rados 2026-03-31T20:29:47.129 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'data' 2026-03-31T20:29:47.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1953: test_mon_osd: ceph osd lspools 2026-03-31T20:29:47.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1953: test_mon_osd: grep data 2026-03-31T20:29:47.358 INFO:tasks.workunit.client.0.vm03.stdout:25 data 2026-03-31T20:29:47.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1954: test_mon_osd: ceph osd map data foo 2026-03-31T20:29:47.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1954: test_mon_osd: grep 'pool.*data.*object.*foo.*pg.*up.*acting' 2026-03-31T20:29:47.562 INFO:tasks.workunit.client.0.vm03.stdout:osdmap e357 pool 'data' (25) object 'foo' -> pg 25.7fc1f406 (25.6) -> up ([2,1], p2) acting ([2,1], p2) 2026-03-31T20:29:47.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1955: test_mon_osd: ceph osd map data foo namespace 2026-03-31T20:29:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1955: test_mon_osd: grep 'pool.*data.*object.*namespace/foo.*pg.*up.*acting' 2026-03-31T20:29:47.770 INFO:tasks.workunit.client.0.vm03.stdout:osdmap e357 pool 'data' 
(25) object 'namespace/foo' -> pg 25.9091039 (25.9) -> up ([2,1], p2) acting ([2,1], p2) 2026-03-31T20:29:47.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1956: test_mon_osd: ceph osd pool delete data data --yes-i-really-really-mean-it 2026-03-31T20:29:48.195 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data' does not exist 2026-03-31T20:29:48.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1958: test_mon_osd: ceph osd pause 2026-03-31T20:29:50.154 INFO:tasks.workunit.client.0.vm03.stderr:pauserd,pausewr is set 2026-03-31T20:29:50.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1959: test_mon_osd: ceph osd dump 2026-03-31T20:29:50.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1959: test_mon_osd: grep 'flags.*pauserd,pausewr' 2026-03-31T20:29:50.373 INFO:tasks.workunit.client.0.vm03.stdout:flags pauserd,pausewr,sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit 2026-03-31T20:29:50.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1960: test_mon_osd: ceph osd unpause 2026-03-31T20:29:52.164 INFO:tasks.workunit.client.0.vm03.stderr:pauserd,pausewr is unset 2026-03-31T20:29:52.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1962: test_mon_osd: ceph osd tree 2026-03-31T20:29:52.370 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:52.370 INFO:tasks.workunit.client.0.vm03.stdout:-1 0.26367 root default 2026-03-31T20:29:52.370 INFO:tasks.workunit.client.0.vm03.stdout:-3 0.26367 host vm03 2026-03-31T20:29:52.370 INFO:tasks.workunit.client.0.vm03.stdout: 0 hdd 0.08789 osd.0 up 0.50000 1.00000 2026-03-31T20:29:52.370 INFO:tasks.workunit.client.0.vm03.stdout: 1 hdd 0.08789 osd.1 up 1.00000 1.00000 2026-03-31T20:29:52.370 INFO:tasks.workunit.client.0.vm03.stdout: 2 hdd 0.08789 osd.2 up 1.00000 1.00000 2026-03-31T20:29:52.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1963: test_mon_osd: ceph osd tree up 2026-03-31T20:29:52.572 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:52.572 INFO:tasks.workunit.client.0.vm03.stdout:-1 0.26367 root default 2026-03-31T20:29:52.572 INFO:tasks.workunit.client.0.vm03.stdout:-3 0.26367 host vm03 2026-03-31T20:29:52.572 INFO:tasks.workunit.client.0.vm03.stdout: 0 hdd 0.08789 osd.0 up 0.50000 1.00000 2026-03-31T20:29:52.572 INFO:tasks.workunit.client.0.vm03.stdout: 1 hdd 0.08789 osd.1 up 1.00000 1.00000 2026-03-31T20:29:52.572 INFO:tasks.workunit.client.0.vm03.stdout: 2 hdd 0.08789 osd.2 up 1.00000 1.00000 2026-03-31T20:29:52.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1964: test_mon_osd: ceph osd tree down 2026-03-31T20:29:52.775 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:52.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1965: test_mon_osd: ceph osd tree in 2026-03-31T20:29:52.989 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:52.989 
INFO:tasks.workunit.client.0.vm03.stdout:-1 0.26367 root default 2026-03-31T20:29:52.989 INFO:tasks.workunit.client.0.vm03.stdout:-3 0.26367 host vm03 2026-03-31T20:29:52.989 INFO:tasks.workunit.client.0.vm03.stdout: 0 hdd 0.08789 osd.0 up 0.50000 1.00000 2026-03-31T20:29:52.989 INFO:tasks.workunit.client.0.vm03.stdout: 1 hdd 0.08789 osd.1 up 1.00000 1.00000 2026-03-31T20:29:52.989 INFO:tasks.workunit.client.0.vm03.stdout: 2 hdd 0.08789 osd.2 up 1.00000 1.00000 2026-03-31T20:29:53.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1966: test_mon_osd: ceph osd tree out 2026-03-31T20:29:53.202 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:53.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1967: test_mon_osd: ceph osd tree destroyed 2026-03-31T20:29:53.411 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:53.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1968: test_mon_osd: ceph osd tree up in 2026-03-31T20:29:53.625 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:53.625 INFO:tasks.workunit.client.0.vm03.stdout:-1 0.26367 root default 2026-03-31T20:29:53.625 INFO:tasks.workunit.client.0.vm03.stdout:-3 0.26367 host vm03 2026-03-31T20:29:53.625 INFO:tasks.workunit.client.0.vm03.stdout: 0 hdd 0.08789 osd.0 up 0.50000 1.00000 2026-03-31T20:29:53.625 INFO:tasks.workunit.client.0.vm03.stdout: 1 hdd 0.08789 osd.1 up 1.00000 1.00000 2026-03-31T20:29:53.625 INFO:tasks.workunit.client.0.vm03.stdout: 2 hdd 0.08789 osd.2 up 1.00000 1.00000 2026-03-31T20:29:53.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1969: test_mon_osd: ceph osd tree up out 2026-03-31T20:29:53.831 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:53.831 INFO:tasks.workunit.client.0.vm03.stdout:-1 0.26367 root default 2026-03-31T20:29:53.831 INFO:tasks.workunit.client.0.vm03.stdout:-3 0.26367 host vm03 2026-03-31T20:29:53.831 INFO:tasks.workunit.client.0.vm03.stdout: 0 hdd 0.08789 osd.0 up 0.50000 1.00000 2026-03-31T20:29:53.831 INFO:tasks.workunit.client.0.vm03.stdout: 1 hdd 0.08789 osd.1 up 1.00000 1.00000 2026-03-31T20:29:53.831 INFO:tasks.workunit.client.0.vm03.stdout: 2 hdd 0.08789 osd.2 up 1.00000 1.00000 2026-03-31T20:29:53.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1970: test_mon_osd: ceph osd tree down in 2026-03-31T20:29:54.041 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:54.041 INFO:tasks.workunit.client.0.vm03.stdout:-1 0.26367 root default 2026-03-31T20:29:54.041 INFO:tasks.workunit.client.0.vm03.stdout:-3 0.26367 host vm03 2026-03-31T20:29:54.041 INFO:tasks.workunit.client.0.vm03.stdout: 0 hdd 0.08789 osd.0 up 0.50000 1.00000 2026-03-31T20:29:54.041 INFO:tasks.workunit.client.0.vm03.stdout: 1 hdd 0.08789 osd.1 up 1.00000 1.00000 2026-03-31T20:29:54.041 INFO:tasks.workunit.client.0.vm03.stdout: 2 hdd 0.08789 osd.2 up 1.00000 1.00000 2026-03-31T20:29:54.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1971: test_mon_osd: ceph osd tree down out 
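The tree invocations in this part of the trace establish the filter grammar for ceph osd tree: up, down and destroyed are mutually exclusive status filters, in and out are mutually exclusive weight filters, and one filter from each group may be combined. A quick sketch (on this cluster every OSD is up and in, so the down/out variants print only the header row):

  ceph osd tree up in     # accepted: one status filter plus one weight filter
  ceph osd tree down out  # accepted, but empty here
  ceph osd tree up down   # rejected with EINVAL: only one of up/down/destroyed
  ceph osd tree in out    # rejected with EINVAL: cannot specify both in and out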
2026-03-31T20:29:54.249 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1972: test_mon_osd: ceph osd tree out down 2026-03-31T20:29:54.455 INFO:tasks.workunit.client.0.vm03.stdout:ID CLASS WEIGHT TYPE NAME STATUS REWEIGHT PRI-AFF 2026-03-31T20:29:54.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1973: test_mon_osd: expect_false ceph osd tree up down 2026-03-31T20:29:54.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:54.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tree up down 2026-03-31T20:29:54.615 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: can specify only one of 'up', 'down' and 'destroyed' 2026-03-31T20:29:54.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:54.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1974: test_mon_osd: expect_false ceph osd tree up destroyed 2026-03-31T20:29:54.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:54.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tree up destroyed 2026-03-31T20:29:54.780 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: can specify only one of 'up', 'down' and 'destroyed' 2026-03-31T20:29:54.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:54.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1975: test_mon_osd: expect_false ceph osd tree down destroyed 2026-03-31T20:29:54.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:54.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tree down destroyed 2026-03-31T20:29:54.944 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: can specify only one of 'up', 'down' and 'destroyed' 2026-03-31T20:29:54.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:54.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1976: test_mon_osd: expect_false ceph osd tree up down destroyed 2026-03-31T20:29:54.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:54.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tree up down destroyed 2026-03-31T20:29:55.101 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: can specify only one of 'up', 'down' and 'destroyed' 2026-03-31T20:29:55.105 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:55.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1977: test_mon_osd: expect_false ceph osd tree in out 2026-03-31T20:29:55.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:55.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tree in out 2026-03-31T20:29:55.257 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: cannot specify both 'in' and 'out' 2026-03-31T20:29:55.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:55.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1978: test_mon_osd: expect_false ceph osd tree up foo 2026-03-31T20:29:55.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:55.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd tree up foo 2026-03-31T20:29:55.413 INFO:tasks.workunit.client.0.vm03.stderr:foo not valid: foo not in up|down|in|out|destroyed 2026-03-31T20:29:55.413 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: unused arguments: ['foo'] 2026-03-31T20:29:55.413 INFO:tasks.workunit.client.0.vm03.stderr:osd tree [] [...] : print OSD tree 2026-03-31T20:29:55.413 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:29:55.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:55.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1980: test_mon_osd: ceph osd metadata 2026-03-31T20:29:55.614 INFO:tasks.workunit.client.0.vm03.stdout:[ 2026-03-31T20:29:55.614 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "id": 0, 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "arch": "x86_64", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "back_addr": "v2:192.168.123.103:6813/951776786", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "back_iface": "", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs": "1", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs_dedicated_db": "0", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs_dedicated_wal": "0", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs_single_shared_device": "1", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_allocation_from_file": "0", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_allocator": "avl", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_access_mode": "file", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_block_size": "4096", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_devices": 
"vdb", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_driver": "KernelDevice", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_optimal_io_size": "0", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_path": "/var/lib/ceph/osd/ceph-0/block", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_rotational": "1", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_size": "96636764160", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_support_discard": "1", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_type": "hdd", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_min_alloc_size": "4096", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_onode_segmentation": "active", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_write_mode": "new", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_release": "tentacle", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_short": "20.2.0-721-g5bb32787", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_when_created": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "cpu": "AMD Ryzen 9 7950X3D 16-Core Processor", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "created_at": "2026-03-31T20:21:20.355983Z", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "default_device_class": "hdd", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "device_ids": "vdb=DWNBRSTVMM03001", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "device_paths": "vdb=/dev/disk/by-path/pci-0000:00:06.0", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "devices": "vdb", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "distro": "ubuntu", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "distro_description": "Ubuntu 22.04.5 LTS", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "distro_version": "22.04", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "front_addr": "v2:192.168.123.103:6804/950776786", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "front_iface": "", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "hb_back_addr": "v2:192.168.123.103:6807/950776786", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "hb_front_addr": "v2:192.168.123.103:6806/950776786", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "hostname": "vm03", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "journal_rotational": "1", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_description": "#181-Ubuntu SMP Fri Feb 6 22:44:50 UTC 2026", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_version": "5.15.0-171-generic", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "mem_swap_kb": "0", 2026-03-31T20:29:55.615 
INFO:tasks.workunit.client.0.vm03.stdout: "mem_total_kb": "10179900", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "network_numa_unknown_ifaces": "back_iface,front_iface", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "objectstore_numa_unknown_devices": "vdb", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "os": "Linux", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "osd_data": "/var/lib/ceph/osd/ceph-0", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "osd_objectstore": "bluestore", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "osdspec_affinity": "", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "rotational": "1" 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "id": 1, 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "arch": "x86_64", 2026-03-31T20:29:55.615 INFO:tasks.workunit.client.0.vm03.stdout: "back_addr": "v2:192.168.123.103:6801/3578452477", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "back_iface": "", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs": "1", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs_dedicated_db": "0", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs_dedicated_wal": "0", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs_single_shared_device": "1", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_allocation_from_file": "0", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_allocator": "avl", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_access_mode": "file", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_block_size": "4096", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_devices": "vdc", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_driver": "KernelDevice", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_optimal_io_size": "0", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_path": "/var/lib/ceph/osd/ceph-1/block", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_rotational": "1", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_size": "96636764160", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_support_discard": "1", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_type": "hdd", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_min_alloc_size": "4096", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_onode_segmentation": "active", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_write_mode": "new", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_release": "tentacle", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_short": 
"20.2.0-721-g5bb32787", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_when_created": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "cpu": "AMD Ryzen 9 7950X3D 16-Core Processor", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "created_at": "2026-03-31T20:21:21.343298Z", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "default_device_class": "hdd", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "device_ids": "vdc=DWNBRSTVMM03002", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "device_paths": "vdc=/dev/disk/by-path/pci-0000:00:07.0", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "devices": "vdc", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "distro": "ubuntu", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "distro_description": "Ubuntu 22.04.5 LTS", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "distro_version": "22.04", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "front_addr": "v2:192.168.123.103:6800/3578452477", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "front_iface": "", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "hb_back_addr": "v2:192.168.123.103:6803/3578452477", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "hb_front_addr": "v2:192.168.123.103:6802/3578452477", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "hostname": "vm03", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "journal_rotational": "1", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_description": "#181-Ubuntu SMP Fri Feb 6 22:44:50 UTC 2026", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_version": "5.15.0-171-generic", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "mem_swap_kb": "0", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "mem_total_kb": "10179900", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "network_numa_unknown_ifaces": "back_iface,front_iface", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "objectstore_numa_unknown_devices": "vdc", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "os": "Linux", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "osd_data": "/var/lib/ceph/osd/ceph-1", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "osd_objectstore": "bluestore", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "osdspec_affinity": "", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "rotational": "1" 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "id": 2, 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "arch": "x86_64", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "back_addr": "v2:192.168.123.103:6809/1523795840", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "back_iface": "", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs": "1", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs_dedicated_db": "0", 
2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs_dedicated_wal": "0", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluefs_single_shared_device": "1", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_allocation_from_file": "0", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_allocator": "avl", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_access_mode": "file", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_block_size": "4096", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_devices": "vdd", 2026-03-31T20:29:55.616 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_driver": "KernelDevice", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_optimal_io_size": "0", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_path": "/var/lib/ceph/osd/ceph-2/block", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_rotational": "1", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_size": "96636764160", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_support_discard": "1", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_bdev_type": "hdd", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_min_alloc_size": "4096", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_onode_segmentation": "active", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore_write_mode": "new", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_release": "tentacle", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_short": "20.2.0-721-g5bb32787", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "ceph_version_when_created": "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "cpu": "AMD Ryzen 9 7950X3D 16-Core Processor", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "created_at": "2026-03-31T20:21:22.350025Z", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "default_device_class": "hdd", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "device_ids": "vdd=DWNBRSTVMM03003", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "device_paths": "vdd=/dev/disk/by-path/pci-0000:00:08.0", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "devices": "vdd", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "distro": "ubuntu", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "distro_description": "Ubuntu 22.04.5 LTS", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "distro_version": "22.04", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "front_addr": "v2:192.168.123.103:6808/1523795840", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "front_iface": "", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "hb_back_addr": 
"v2:192.168.123.103:6811/1523795840", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "hb_front_addr": "v2:192.168.123.103:6810/1523795840", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "hostname": "vm03", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "journal_rotational": "1", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_description": "#181-Ubuntu SMP Fri Feb 6 22:44:50 UTC 2026", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "kernel_version": "5.15.0-171-generic", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "mem_swap_kb": "0", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "mem_total_kb": "10179900", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "network_numa_unknown_ifaces": "back_iface,front_iface", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "objectstore_numa_unknown_devices": "vdd", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "os": "Linux", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "osd_data": "/var/lib/ceph/osd/ceph-2", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "osd_objectstore": "bluestore", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "osdspec_affinity": "", 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: "rotational": "1" 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:29:55.617 INFO:tasks.workunit.client.0.vm03.stdout:] 2026-03-31T20:29:55.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1981: test_mon_osd: ceph osd count-metadata os 2026-03-31T20:29:55.838 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:29:55.838 INFO:tasks.workunit.client.0.vm03.stdout: "Linux": 3 2026-03-31T20:29:55.838 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:29:55.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1982: test_mon_osd: ceph osd versions 2026-03-31T20:29:56.060 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:29:56.060 INFO:tasks.workunit.client.0.vm03.stdout: "ceph version 20.2.0-721-g5bb32787 (5bb3278730741031382ca9c3dc9d221a942e06a2) tentacle (stable - None)": 3 2026-03-31T20:29:56.060 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:29:56.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1984: test_mon_osd: ceph osd perf 2026-03-31T20:29:56.287 INFO:tasks.workunit.client.0.vm03.stdout:osd commit_latency(ms) apply_latency(ms) 2026-03-31T20:29:56.287 INFO:tasks.workunit.client.0.vm03.stdout: 2 0 0 2026-03-31T20:29:56.287 INFO:tasks.workunit.client.0.vm03.stdout: 1 1 1 2026-03-31T20:29:56.287 INFO:tasks.workunit.client.0.vm03.stdout: 0 1 1 2026-03-31T20:29:56.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1985: test_mon_osd: ceph osd blocked-by 2026-03-31T20:29:56.496 INFO:tasks.workunit.client.0.vm03.stdout:osd num_blocked 2026-03-31T20:29:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1987: test_mon_osd: ceph osd stat 2026-03-31T20:29:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1987: test_mon_osd: grep up 2026-03-31T20:29:56.727 
INFO:tasks.workunit.client.0.vm03.stdout:3 osds: 3 up (since 2m), 3 in (since 18s); epoch: e362 2026-03-31T20:29:56.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:29:56.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_config_key 2026-03-31T20:29:56.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1472: test_mon_config_key: key=asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:56.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1473: test_mon_config_key: ceph config-key list 2026-03-31T20:29:56.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1473: test_mon_config_key: grep -c asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:56.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1473: test_mon_config_key: grep 0 2026-03-31T20:29:57.260 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-31T20:29:57.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1474: test_mon_config_key: ceph config-key get asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:57.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1474: test_mon_config_key: grep -c bar 2026-03-31T20:29:57.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1474: test_mon_config_key: grep 0 2026-03-31T20:29:57.457 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: 2026-03-31T20:29:57.462 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-31T20:29:57.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1475: test_mon_config_key: ceph config-key set asdfasdfqwerqwreasdfuniquesa123df bar 2026-03-31T20:29:57.759 INFO:tasks.workunit.client.0.vm03.stderr:set asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:57.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1476: test_mon_config_key: ceph config-key get asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:57.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1476: test_mon_config_key: grep bar 2026-03-31T20:29:58.074 INFO:tasks.workunit.client.0.vm03.stdout:bar 2026-03-31T20:29:58.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1477: test_mon_config_key: ceph config-key list 2026-03-31T20:29:58.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1477: test_mon_config_key: grep -c asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:58.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1477: test_mon_config_key: grep 1 2026-03-31T20:29:58.379 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-31T20:29:58.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1478: test_mon_config_key: ceph config-key dump 2026-03-31T20:29:58.379 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1478: test_mon_config_key: grep asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:58.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1478: test_mon_config_key: grep bar 2026-03-31T20:29:58.687 INFO:tasks.workunit.client.0.vm03.stdout: "asdfasdfqwerqwreasdfuniquesa123df": "bar", 2026-03-31T20:29:58.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1479: test_mon_config_key: ceph config-key rm asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:58.975 INFO:tasks.workunit.client.0.vm03.stderr:key deleted 2026-03-31T20:29:58.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1480: test_mon_config_key: expect_false ceph config-key get asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:58.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:29:58.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph config-key get asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:59.187 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: 2026-03-31T20:29:59.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:29:59.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1481: test_mon_config_key: ceph config-key list 2026-03-31T20:29:59.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1481: test_mon_config_key: grep -c asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:59.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1481: test_mon_config_key: grep 0 2026-03-31T20:29:59.499 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-31T20:29:59.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1482: test_mon_config_key: ceph config-key dump 2026-03-31T20:29:59.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1482: test_mon_config_key: grep -c asdfasdfqwerqwreasdfuniquesa123df 2026-03-31T20:29:59.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1482: test_mon_config_key: grep 0 2026-03-31T20:29:59.806 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-31T20:29:59.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:30:00.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_crush 2026-03-31T20:30:00.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1992: test_mon_crush: f=/tmp/cephtool.sYl/map.26274 2026-03-31T20:30:00.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1993: test_mon_crush: ceph osd getcrushmap -o /tmp/cephtool.sYl/map.26274 2026-03-31T20:30:00.021 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1993: test_mon_crush: tail -n1 2026-03-31T20:30:00.233 INFO:tasks.workunit.client.0.vm03.stdout:epoch 21 nextepoch 22 2026-03-31T20:30:00.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1993: test_mon_crush: epoch=21 2026-03-31T20:30:00.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1994: test_mon_crush: '[' -s /tmp/cephtool.sYl/map.26274 ']' 2026-03-31T20:30:00.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1995: test_mon_crush: '[' 21 -gt 1 ']' 2026-03-31T20:30:00.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1996: test_mon_crush: nextepoch=22 2026-03-31T20:30:00.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1997: test_mon_crush: echo epoch 21 nextepoch 22 2026-03-31T20:30:00.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1998: test_mon_crush: rm -f /tmp/cephtool.sYl/map.26274.epoch 2026-03-31T20:30:00.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1999: test_mon_crush: expect_false ceph osd setcrushmap 22 -i /tmp/cephtool.sYl/map.26274 2026-03-31T20:30:00.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:00.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd setcrushmap 22 -i /tmp/cephtool.sYl/map.26274 2026-03-31T20:30:00.383 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: prior_version 22 != crush version 21 2026-03-31T20:30:00.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:00.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2000: test_mon_crush: ceph osd setcrushmap 21 -i /tmp/cephtool.sYl/map.26274 2026-03-31T20:30:00.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2000: test_mon_crush: tail -n1 2026-03-31T20:30:01.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2000: test_mon_crush: gotepoch=22 2026-03-31T20:30:01.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2001: test_mon_crush: echo gotepoch 22 2026-03-31T20:30:01.293 INFO:tasks.workunit.client.0.vm03.stdout:gotepoch 22 2026-03-31T20:30:01.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2002: test_mon_crush: '[' 22 -eq 22 ']' 2026-03-31T20:30:01.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2004: test_mon_crush: ceph osd setcrushmap 21 -i /tmp/cephtool.sYl/map.26274 2026-03-31T20:30:01.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2004: test_mon_crush: tail -n1 2026-03-31T20:30:01.498 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2004: test_mon_crush: gotepoch=22 2026-03-31T20:30:01.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2005: test_mon_crush: echo epoch 22 2026-03-31T20:30:01.498 INFO:tasks.workunit.client.0.vm03.stdout:epoch 22 2026-03-31T20:30:01.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2006: test_mon_crush: '[' 22 -eq 22 ']' 2026-03-31T20:30:01.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2007: test_mon_crush: rm /tmp/cephtool.sYl/map.26274 2026-03-31T20:30:01.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:30:01.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_osd_create_destroy 2026-03-31T20:30:01.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1335: test_mon_osd_create_destroy: ceph osd new 2026-03-31T20:30:01.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1335: test_mon_osd_create_destroy: grep EINVAL 2026-03-31T20:30:01.859 INFO:tasks.workunit.client.0.vm03.stdout:Error EINVAL: invalid command 2026-03-31T20:30:01.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1336: test_mon_osd_create_destroy: ceph osd new '' -1 2026-03-31T20:30:01.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1336: test_mon_osd_create_destroy: grep EINVAL 2026-03-31T20:30:02.012 INFO:tasks.workunit.client.0.vm03.stdout:Error EINVAL: invalid command 2026-03-31T20:30:02.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1337: test_mon_osd_create_destroy: ceph osd new '' 10 2026-03-31T20:30:02.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1337: test_mon_osd_create_destroy: grep EINVAL 2026-03-31T20:30:02.164 INFO:tasks.workunit.client.0.vm03.stdout:Error EINVAL: invalid command 2026-03-31T20:30:02.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1339: test_mon_osd_create_destroy: ceph osd getmaxosd 2026-03-31T20:30:02.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1339: test_mon_osd_create_destroy: sed -e 's/max_osd = //' -e 's/ in epoch.*//' 2026-03-31T20:30:02.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1339: test_mon_osd_create_destroy: old_maxosd=3 2026-03-31T20:30:02.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1341: test_mon_osd_create_destroy: ceph osd ls 2026-03-31T20:30:02.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1341: test_mon_osd_create_destroy: old_osds='0 2026-03-31T20:30:02.586 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:30:02.586 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:30:02.586 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1342: test_mon_osd_create_destroy: ceph osd ls 2026-03-31T20:30:02.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1342: test_mon_osd_create_destroy: wc -l 2026-03-31T20:30:02.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1342: test_mon_osd_create_destroy: num_osds=3 2026-03-31T20:30:02.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1344: test_mon_osd_create_destroy: uuidgen 2026-03-31T20:30:02.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1344: test_mon_osd_create_destroy: uuid=4582b05f-5b20-445a-a42b-89d12b4a71ba 2026-03-31T20:30:02.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1345: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba 2026-03-31T20:30:03.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1345: test_mon_osd_create_destroy: id=3 2026-03-31T20:30:03.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1347: test_mon_osd_create_destroy: for i in $old_osds 2026-03-31T20:30:03.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1348: test_mon_osd_create_destroy: [[ 0 != \3 ]] 2026-03-31T20:30:03.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1347: test_mon_osd_create_destroy: for i in $old_osds 2026-03-31T20:30:03.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1348: test_mon_osd_create_destroy: [[ 1 != \3 ]] 2026-03-31T20:30:03.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1347: test_mon_osd_create_destroy: for i in $old_osds 2026-03-31T20:30:03.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1348: test_mon_osd_create_destroy: [[ 2 != \3 ]] 2026-03-31T20:30:03.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1351: test_mon_osd_create_destroy: ceph osd find 3 2026-03-31T20:30:03.220 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:30:03.220 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 3, 2026-03-31T20:30:03.220 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:30:03.220 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [] 2026-03-31T20:30:03.220 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:30:03.220 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "4582b05f-5b20-445a-a42b-89d12b4a71ba", 2026-03-31T20:30:03.220 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": {} 2026-03-31T20:30:03.220 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:30:03.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1353: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba 2026-03-31T20:30:03.447 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1353: test_mon_osd_create_destroy: id2=3 2026-03-31T20:30:03.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1355: test_mon_osd_create_destroy: [[ 3 == 3 ]] 2026-03-31T20:30:03.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1357: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba 3 2026-03-31T20:30:03.652 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-31T20:30:03.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1359: test_mon_osd_create_destroy: ceph osd getmaxosd 2026-03-31T20:30:03.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1359: test_mon_osd_create_destroy: sed -e 's/max_osd = //' -e 's/ in epoch.*//' 2026-03-31T20:30:03.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1359: test_mon_osd_create_destroy: id3=4 2026-03-31T20:30:03.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1360: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba 5 2026-03-31T20:30:03.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1360: test_mon_osd_create_destroy: grep EEXIST 2026-03-31T20:30:04.028 INFO:tasks.workunit.client.0.vm03.stdout:Error EEXIST: uuid 4582b05f-5b20-445a-a42b-89d12b4a71ba already in use for different id 3 2026-03-31T20:30:04.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1362: test_mon_osd_create_destroy: uuidgen 2026-03-31T20:30:04.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1362: test_mon_osd_create_destroy: uuid2=a4200020-4303-4949-9fb6-b0416e703438 2026-03-31T20:30:04.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1363: test_mon_osd_create_destroy: ceph osd new a4200020-4303-4949-9fb6-b0416e703438 2026-03-31T20:30:04.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1363: test_mon_osd_create_destroy: id2=4 2026-03-31T20:30:04.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1364: test_mon_osd_create_destroy: ceph osd find 4 2026-03-31T20:30:04.455 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:30:04.455 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 4, 2026-03-31T20:30:04.455 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:30:04.455 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [] 2026-03-31T20:30:04.455 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:30:04.455 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "a4200020-4303-4949-9fb6-b0416e703438", 2026-03-31T20:30:04.455 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": {} 2026-03-31T20:30:04.455 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:30:04.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1365: test_mon_osd_create_destroy: [[ 4 != \3 ]] 2026-03-31T20:30:04.468 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1367: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba 4 2026-03-31T20:30:04.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1367: test_mon_osd_create_destroy: grep EEXIST 2026-03-31T20:30:04.622 INFO:tasks.workunit.client.0.vm03.stdout:Error EEXIST: uuid 4582b05f-5b20-445a-a42b-89d12b4a71ba already in use for different id 3 2026-03-31T20:30:04.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1368: test_mon_osd_create_destroy: ceph osd new a4200020-4303-4949-9fb6-b0416e703438 4 2026-03-31T20:30:04.828 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-31T20:30:04.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1371: test_mon_osd_create_destroy: gen_secrets_file empty 2026-03-31T20:30:04.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1293: gen_secrets_file: local t=empty 2026-03-31T20:30:04.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1294: gen_secrets_file: [[ -z empty ]] 2026-03-31T20:30:04.839 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: mktemp /tmp/cephtool.sYl/secret.XXXXXX 2026-03-31T20:30:04.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: fn=/tmp/cephtool.sYl/secret.uKnSkH 2026-03-31T20:30:04.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1299: gen_secrets_file: echo /tmp/cephtool.sYl/secret.uKnSkH 2026-03-31T20:30:04.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1300: gen_secrets_file: [[ empty == \e\m\p\t\y ]] 2026-03-31T20:30:04.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1301: gen_secrets_file: return 0 2026-03-31T20:30:04.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1371: test_mon_osd_create_destroy: empty_secrets=/tmp/cephtool.sYl/secret.uKnSkH 2026-03-31T20:30:04.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1372: test_mon_osd_create_destroy: gen_secrets_file empty_json 2026-03-31T20:30:04.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1293: gen_secrets_file: local t=empty_json 2026-03-31T20:30:04.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1294: gen_secrets_file: [[ -z empty_json ]] 2026-03-31T20:30:04.841 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: mktemp /tmp/cephtool.sYl/secret.XXXXXX 2026-03-31T20:30:04.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: fn=/tmp/cephtool.sYl/secret.VxXuZ9 2026-03-31T20:30:04.842 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1299: gen_secrets_file: echo /tmp/cephtool.sYl/secret.VxXuZ9 2026-03-31T20:30:04.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1300: gen_secrets_file: [[ empty_json == \e\m\p\t\y ]] 2026-03-31T20:30:04.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1304: gen_secrets_file: echo '{' 2026-03-31T20:30:04.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1305: gen_secrets_file: [[ empty_json == \b\a\d\_\j\s\o\n ]] 2026-03-31T20:30:04.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1308: gen_secrets_file: [[ empty_json == \e\m\p\t\y\_\j\s\o\n ]] 2026-03-31T20:30:04.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1309: gen_secrets_file: echo '}' 2026-03-31T20:30:04.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1310: gen_secrets_file: return 0 2026-03-31T20:30:04.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1372: test_mon_osd_create_destroy: empty_json=/tmp/cephtool.sYl/secret.VxXuZ9 2026-03-31T20:30:04.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1373: test_mon_osd_create_destroy: gen_secrets_file all 2026-03-31T20:30:04.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1293: gen_secrets_file: local t=all 2026-03-31T20:30:04.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1294: gen_secrets_file: [[ -z all ]] 2026-03-31T20:30:04.843 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: mktemp /tmp/cephtool.sYl/secret.XXXXXX 2026-03-31T20:30:04.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: fn=/tmp/cephtool.sYl/secret.TrF5yY 2026-03-31T20:30:04.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1299: gen_secrets_file: echo /tmp/cephtool.sYl/secret.TrF5yY 2026-03-31T20:30:04.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1300: gen_secrets_file: [[ all == \e\m\p\t\y ]] 2026-03-31T20:30:04.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1304: gen_secrets_file: echo '{' 2026-03-31T20:30:04.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1305: gen_secrets_file: [[ all == \b\a\d\_\j\s\o\n ]] 2026-03-31T20:30:04.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1308: gen_secrets_file: [[ all == \e\m\p\t\y\_\j\s\o\n ]] 2026-03-31T20:30:04.844 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1313: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.855 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1313: gen_secrets_file: cephx_secret='"cephx_secret": "AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q=="' 2026-03-31T20:30:04.855 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1314: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.865 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1314: gen_secrets_file: lb_secret='"cephx_lockbox_secret": "AQDMLsxpEs+EMxAAvtBmq8AFcT3homMC5czb8Q=="' 2026-03-31T20:30:04.865 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1315: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1315: gen_secrets_file: dmcrypt_key='"dmcrypt_key": "AQDMLsxpXVMVNBAAJCNnrgT0z37JVpuiIKgfwQ=="' 2026-03-31T20:30:04.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1317: gen_secrets_file: [[ all == \a\l\l ]] 2026-03-31T20:30:04.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1318: gen_secrets_file: echo '"cephx_secret": "AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q==","cephx_lockbox_secret": "AQDMLsxpEs+EMxAAvtBmq8AFcT3homMC5czb8Q==","dmcrypt_key": "AQDMLsxpXVMVNBAAJCNnrgT0z37JVpuiIKgfwQ=="' 2026-03-31T20:30:04.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1329: gen_secrets_file: echo '}' 2026-03-31T20:30:04.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1330: gen_secrets_file: return 0 2026-03-31T20:30:04.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1373: test_mon_osd_create_destroy: all_secrets=/tmp/cephtool.sYl/secret.TrF5yY 2026-03-31T20:30:04.875 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1374: test_mon_osd_create_destroy: gen_secrets_file cephx 2026-03-31T20:30:04.875 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1293: gen_secrets_file: local t=cephx 2026-03-31T20:30:04.875 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1294: gen_secrets_file: [[ -z cephx ]] 2026-03-31T20:30:04.875 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: mktemp /tmp/cephtool.sYl/secret.XXXXXX 2026-03-31T20:30:04.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: fn=/tmp/cephtool.sYl/secret.e330pQ 2026-03-31T20:30:04.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1299: gen_secrets_file: echo /tmp/cephtool.sYl/secret.e330pQ 2026-03-31T20:30:04.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1300: gen_secrets_file: [[ cephx == \e\m\p\t\y ]] 2026-03-31T20:30:04.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1304: 
gen_secrets_file: echo '{' 2026-03-31T20:30:04.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1305: gen_secrets_file: [[ cephx == \b\a\d\_\j\s\o\n ]] 2026-03-31T20:30:04.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1308: gen_secrets_file: [[ cephx == \e\m\p\t\y\_\j\s\o\n ]] 2026-03-31T20:30:04.876 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1313: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1313: gen_secrets_file: cephx_secret='"cephx_secret": "AQDMLsxpM9PJNBAAVBQFLBKdLPZJ3rdq6ymtOA=="' 2026-03-31T20:30:04.886 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1314: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1314: gen_secrets_file: lb_secret='"cephx_lockbox_secret": "AQDMLsxplt9cNRAAjpXWYAVuu+H6HGlprEPKtg=="' 2026-03-31T20:30:04.896 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1315: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1315: gen_secrets_file: dmcrypt_key='"dmcrypt_key": "AQDMLsxpwNXrNRAAnPtFi718Na4kXGe2w90IEA=="' 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1317: gen_secrets_file: [[ cephx == \a\l\l ]] 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1319: gen_secrets_file: [[ cephx == \c\e\p\h\x ]] 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1320: gen_secrets_file: echo '"cephx_secret": "AQDMLsxpM9PJNBAAVBQFLBKdLPZJ3rdq6ymtOA=="' 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1329: gen_secrets_file: echo '}' 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1330: gen_secrets_file: return 0 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1374: test_mon_osd_create_destroy: cephx_only=/tmp/cephtool.sYl/secret.e330pQ 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1375: test_mon_osd_create_destroy: gen_secrets_file no_cephx 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1293: gen_secrets_file: local t=no_cephx 2026-03-31T20:30:04.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1294: gen_secrets_file: [[ -z no_cephx ]] 2026-03-31T20:30:04.906 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: mktemp /tmp/cephtool.sYl/secret.XXXXXX 
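The gen_secrets_file traces above and below show how the workunit builds the JSON payloads later fed to "ceph osd new -i": it mktemps a file, opens it with "{", emits some subset of "cephx_secret", "cephx_lockbox_secret", and "dmcrypt_key" (each freshly generated with "ceph-authtool --gen-print-key") depending on the variant (empty, empty_json, all, cephx, no_cephx, no_lockbox, bad_json), and closes with "}". A hedged sketch of the "all" shape, with an illustrative file name; the real helper lives in qa/workunits/cephtool/test.sh:

  fn=$(mktemp /tmp/secret.XXXXXX)
  cephx=$(ceph-authtool --gen-print-key)
  lockbox=$(ceph-authtool --gen-print-key)
  dmcrypt=$(ceph-authtool --gen-print-key)
  # Assemble the three secrets into one JSON object, as the trace does.
  printf '{\n"cephx_secret": "%s",\n"cephx_lockbox_secret": "%s",\n"dmcrypt_key": "%s"\n}\n' \
      "$cephx" "$lockbox" "$dmcrypt" > "$fn"

The variants that omit keys serve as negative tests below: a payload without cephx_secret draws "Error EINVAL: requires a cephx secret.", and one missing the lockbox/dm-crypt pair draws "Error EINVAL: requires both a cephx lockbox secret and a dm-crypt key."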
2026-03-31T20:30:04.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: fn=/tmp/cephtool.sYl/secret.MD27xQ 2026-03-31T20:30:04.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1299: gen_secrets_file: echo /tmp/cephtool.sYl/secret.MD27xQ 2026-03-31T20:30:04.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1300: gen_secrets_file: [[ no_cephx == \e\m\p\t\y ]] 2026-03-31T20:30:04.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1304: gen_secrets_file: echo '{' 2026-03-31T20:30:04.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1305: gen_secrets_file: [[ no_cephx == \b\a\d\_\j\s\o\n ]] 2026-03-31T20:30:04.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1308: gen_secrets_file: [[ no_cephx == \e\m\p\t\y\_\j\s\o\n ]] 2026-03-31T20:30:04.907 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1313: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1313: gen_secrets_file: cephx_secret='"cephx_secret": "AQDMLsxp/B+YNhAATkfqQmySMfrR7VAs3vQJXw=="' 2026-03-31T20:30:04.916 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1314: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.926 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1314: gen_secrets_file: lb_secret='"cephx_lockbox_secret": "AQDMLsxpQFwmNxAAXfruCKjwshehMRrWiHEfmA=="' 2026-03-31T20:30:04.926 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1315: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1315: gen_secrets_file: dmcrypt_key='"dmcrypt_key": "AQDMLsxpIx22NxAAnk7cCpIbRKnqk8KGAccW5g=="' 2026-03-31T20:30:04.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1317: gen_secrets_file: [[ no_cephx == \a\l\l ]] 2026-03-31T20:30:04.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1319: gen_secrets_file: [[ no_cephx == \c\e\p\h\x ]] 2026-03-31T20:30:04.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1321: gen_secrets_file: [[ no_cephx == \n\o\_\c\e\p\h\x ]] 2026-03-31T20:30:04.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1322: gen_secrets_file: echo '"cephx_lockbox_secret": "AQDMLsxpQFwmNxAAXfruCKjwshehMRrWiHEfmA==","dmcrypt_key": "AQDMLsxpIx22NxAAnk7cCpIbRKnqk8KGAccW5g=="' 2026-03-31T20:30:04.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1329: gen_secrets_file: echo '}' 2026-03-31T20:30:04.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1330: 
gen_secrets_file: return 0 2026-03-31T20:30:04.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1375: test_mon_osd_create_destroy: no_cephx=/tmp/cephtool.sYl/secret.MD27xQ 2026-03-31T20:30:04.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1376: test_mon_osd_create_destroy: gen_secrets_file no_lockbox 2026-03-31T20:30:04.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1293: gen_secrets_file: local t=no_lockbox 2026-03-31T20:30:04.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1294: gen_secrets_file: [[ -z no_lockbox ]] 2026-03-31T20:30:04.936 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: mktemp /tmp/cephtool.sYl/secret.XXXXXX 2026-03-31T20:30:04.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: fn=/tmp/cephtool.sYl/secret.wfEQRa 2026-03-31T20:30:04.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1299: gen_secrets_file: echo /tmp/cephtool.sYl/secret.wfEQRa 2026-03-31T20:30:04.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1300: gen_secrets_file: [[ no_lockbox == \e\m\p\t\y ]] 2026-03-31T20:30:04.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1304: gen_secrets_file: echo '{' 2026-03-31T20:30:04.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1305: gen_secrets_file: [[ no_lockbox == \b\a\d\_\j\s\o\n ]] 2026-03-31T20:30:04.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1308: gen_secrets_file: [[ no_lockbox == \e\m\p\t\y\_\j\s\o\n ]] 2026-03-31T20:30:04.937 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1313: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.946 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1313: gen_secrets_file: cephx_secret='"cephx_secret": "AQDMLsxp3CRkOBAAj1Sj1993+4KEo50qLVc5IQ=="' 2026-03-31T20:30:04.947 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1314: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1314: gen_secrets_file: lb_secret='"cephx_lockbox_secret": "AQDMLsxp0DfxOBAA6y40u1xFZU3H9+Nyk2WS7w=="' 2026-03-31T20:30:04.956 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1315: gen_secrets_file: ceph-authtool --gen-print-key 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1315: gen_secrets_file: dmcrypt_key='"dmcrypt_key": "AQDMLsxpuSh9ORAAzhCXO/YN72vQW4DeLamrqw=="' 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1317: gen_secrets_file: [[ no_lockbox == \a\l\l 
]] 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1319: gen_secrets_file: [[ no_lockbox == \c\e\p\h\x ]] 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1321: gen_secrets_file: [[ no_lockbox == \n\o\_\c\e\p\h\x ]] 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1323: gen_secrets_file: [[ no_lockbox == \n\o\_\l\o\c\k\b\o\x ]] 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1324: gen_secrets_file: echo '"cephx_secret": "AQDMLsxp3CRkOBAAj1Sj1993+4KEo50qLVc5IQ==","dmcrypt_key": "AQDMLsxpuSh9ORAAzhCXO/YN72vQW4DeLamrqw=="' 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1329: gen_secrets_file: echo '}' 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1330: gen_secrets_file: return 0 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1376: test_mon_osd_create_destroy: no_lockbox=/tmp/cephtool.sYl/secret.wfEQRa 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1377: test_mon_osd_create_destroy: gen_secrets_file bad_json 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1293: gen_secrets_file: local t=bad_json 2026-03-31T20:30:04.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1294: gen_secrets_file: [[ -z bad_json ]] 2026-03-31T20:30:04.966 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: mktemp /tmp/cephtool.sYl/secret.XXXXXX 2026-03-31T20:30:04.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1298: gen_secrets_file: fn=/tmp/cephtool.sYl/secret.6SA3j2 2026-03-31T20:30:04.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1299: gen_secrets_file: echo /tmp/cephtool.sYl/secret.6SA3j2 2026-03-31T20:30:04.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1300: gen_secrets_file: [[ bad_json == \e\m\p\t\y ]] 2026-03-31T20:30:04.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1304: gen_secrets_file: echo '{' 2026-03-31T20:30:04.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1305: gen_secrets_file: [[ bad_json == \b\a\d\_\j\s\o\n ]] 2026-03-31T20:30:04.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1306: gen_secrets_file: echo 'asd: ; }' 2026-03-31T20:30:04.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1307: gen_secrets_file: return 0 2026-03-31T20:30:04.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1377: 
test_mon_osd_create_destroy: bad_json=/tmp/cephtool.sYl/secret.6SA3j2 2026-03-31T20:30:04.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1380: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba 3 -i /tmp/cephtool.sYl/secret.uKnSkH 2026-03-31T20:30:05.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1380: test_mon_osd_create_destroy: new_id=3 2026-03-31T20:30:05.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1381: test_mon_osd_create_destroy: [[ 3 == \3 ]] 2026-03-31T20:30:05.186 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1384: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba 3 -i /tmp/cephtool.sYl/secret.VxXuZ9 2026-03-31T20:30:05.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1384: test_mon_osd_create_destroy: new_id=3 2026-03-31T20:30:05.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1385: test_mon_osd_create_destroy: [[ 3 == \3 ]] 2026-03-31T20:30:05.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1387: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba 3 -i /tmp/cephtool.sYl/secret.TrF5yY 2026-03-31T20:30:05.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1387: test_mon_osd_create_destroy: grep EEXIST 2026-03-31T20:30:05.561 INFO:tasks.workunit.client.0.vm03.stdout:Error EEXIST: osd.3 exists but secrets do not match 2026-03-31T20:30:05.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1389: test_mon_osd_create_destroy: ceph osd rm 3 2026-03-31T20:30:05.713 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:30:05.706+0000 7f8708384640 -1 mon.a@0(leader).osd e365 definitely_dead 0 2026-03-31T20:30:06.306 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:30:06.302+0000 7f8708384640 -1 mon.a@0(leader).osd e366 definitely_dead 0 2026-03-31T20:30:06.306 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 does not exist. 2026-03-31T20:30:06.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1390: test_mon_osd_create_destroy: ceph osd rm 4 2026-03-31T20:30:06.480 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:30:06.474+0000 7f8708384640 -1 mon.a@0(leader).osd e366 definitely_dead 0 2026-03-31T20:30:07.327 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:30:07.322+0000 7f8708384640 -1 mon.a@0(leader).osd e367 definitely_dead 0 2026-03-31T20:30:07.327 INFO:tasks.workunit.client.0.vm03.stderr:osd.4 does not exist. 
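Up to this point test_mon_osd_create_destroy has shown that "ceph osd new <uuid>" is idempotent for the same uuid (id 3 twice), that reusing a uuid with a different id or mismatched secrets fails with EEXIST, and that "ceph osd rm 3" / "ceph osd rm 4" then clear the provisional ids (the interleaved mon.a "definitely_dead" stderr lines accompany removal of OSDs that never booted). Below, after "ceph osd setmaxosd 3" shrinks the map back to the original three OSDs, the create/lookup round-trip is repeated with the secrets files. A hedged sketch of that round-trip, assuming a full secrets file like the one sketched earlier:

  uuid=$(uuidgen)
  id=$(ceph osd new "$uuid" -i "$fn")  # allocates the lowest free osd id
  ceph osd find "$id"                  # reports the osd_fsid bound to that id
  ceph auth get-key "osd.$id"          # cephx key should match the payload's cephx_secret
  ceph osd rm "$id"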
2026-03-31T20:30:07.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1391: test_mon_osd_create_destroy: ceph osd setmaxosd 3 2026-03-31T20:30:09.281 INFO:tasks.workunit.client.0.vm03.stderr:set new max_osd = 3 2026-03-31T20:30:09.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1393: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba -i /tmp/cephtool.sYl/secret.MD27xQ 2026-03-31T20:30:09.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1393: test_mon_osd_create_destroy: grep EINVAL 2026-03-31T20:30:09.465 INFO:tasks.workunit.client.0.vm03.stdout:Error EINVAL: requires a cephx secret. 2026-03-31T20:30:09.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1394: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba -i /tmp/cephtool.sYl/secret.wfEQRa 2026-03-31T20:30:09.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1394: test_mon_osd_create_destroy: grep EINVAL 2026-03-31T20:30:09.621 INFO:tasks.workunit.client.0.vm03.stdout:Error EINVAL: requires both a cephx lockbox secret and a dm-crypt key. 2026-03-31T20:30:09.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1396: test_mon_osd_create_destroy: ceph osd ls 2026-03-31T20:30:09.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1396: test_mon_osd_create_destroy: osds='0 2026-03-31T20:30:09.831 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:30:09.831 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:30:09.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1397: test_mon_osd_create_destroy: ceph osd new 4582b05f-5b20-445a-a42b-89d12b4a71ba -i /tmp/cephtool.sYl/secret.TrF5yY 2026-03-31T20:30:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1397: test_mon_osd_create_destroy: id=3 2026-03-31T20:30:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1398: test_mon_osd_create_destroy: for i in $osds 2026-03-31T20:30:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1399: test_mon_osd_create_destroy: [[ 0 != \3 ]] 2026-03-31T20:30:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1398: test_mon_osd_create_destroy: for i in $osds 2026-03-31T20:30:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1399: test_mon_osd_create_destroy: [[ 1 != \3 ]] 2026-03-31T20:30:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1398: test_mon_osd_create_destroy: for i in $osds 2026-03-31T20:30:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1399: test_mon_osd_create_destroy: [[ 2 != \3 ]] 2026-03-31T20:30:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1402: test_mon_osd_create_destroy: ceph osd 
find 3 2026-03-31T20:30:10.300 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:30:10.300 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 3, 2026-03-31T20:30:10.300 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:30:10.300 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [] 2026-03-31T20:30:10.300 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:30:10.300 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "4582b05f-5b20-445a-a42b-89d12b4a71ba", 2026-03-31T20:30:10.300 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": {} 2026-03-31T20:30:10.300 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:30:10.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1405: test_mon_osd_create_destroy: ceph auth get-key osd.3 --format=json-pretty 2026-03-31T20:30:10.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1405: test_mon_osd_create_destroy: jq .key 2026-03-31T20:30:10.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1405: test_mon_osd_create_destroy: k='"AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q=="' 2026-03-31T20:30:10.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1406: test_mon_osd_create_destroy: cat /tmp/cephtool.sYl/secret.TrF5yY 2026-03-31T20:30:10.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1406: test_mon_osd_create_destroy: jq .cephx_secret 2026-03-31T20:30:10.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1406: test_mon_osd_create_destroy: s='"AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q=="' 2026-03-31T20:30:10.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1407: test_mon_osd_create_destroy: [[ "AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q==" == "AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q==" ]] 2026-03-31T20:30:10.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1409: test_mon_osd_create_destroy: ceph auth get-key client.osd-lockbox.4582b05f-5b20-445a-a42b-89d12b4a71ba --format=json-pretty 2026-03-31T20:30:10.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1409: test_mon_osd_create_destroy: jq .key 2026-03-31T20:30:10.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1409: test_mon_osd_create_destroy: k='"AQDMLsxpEs+EMxAAvtBmq8AFcT3homMC5czb8Q=="' 2026-03-31T20:30:10.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1410: test_mon_osd_create_destroy: cat /tmp/cephtool.sYl/secret.TrF5yY 2026-03-31T20:30:10.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1410: test_mon_osd_create_destroy: jq .cephx_lockbox_secret 2026-03-31T20:30:10.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1410: test_mon_osd_create_destroy: s='"AQDMLsxpEs+EMxAAvtBmq8AFcT3homMC5czb8Q=="' 2026-03-31T20:30:10.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1411: test_mon_osd_create_destroy: [[ 
"AQDMLsxpEs+EMxAAvtBmq8AFcT3homMC5czb8Q==" == "AQDMLsxpEs+EMxAAvtBmq8AFcT3homMC5czb8Q==" ]] 2026-03-31T20:30:10.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1412: test_mon_osd_create_destroy: ceph config-key exists dm-crypt/osd/4582b05f-5b20-445a-a42b-89d12b4a71ba/luks 2026-03-31T20:30:11.153 INFO:tasks.workunit.client.0.vm03.stderr:key 'dm-crypt/osd/4582b05f-5b20-445a-a42b-89d12b4a71ba/luks' exists 2026-03-31T20:30:11.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1414: test_mon_osd_create_destroy: ceph osd ls 2026-03-31T20:30:11.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1414: test_mon_osd_create_destroy: osds='0 2026-03-31T20:30:11.381 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:30:11.381 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-31T20:30:11.381 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-31T20:30:11.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1415: test_mon_osd_create_destroy: ceph osd new a4200020-4303-4949-9fb6-b0416e703438 -i /tmp/cephtool.sYl/secret.e330pQ 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1415: test_mon_osd_create_destroy: id2=4 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1416: test_mon_osd_create_destroy: for i in $osds 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1417: test_mon_osd_create_destroy: [[ 0 != \4 ]] 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1416: test_mon_osd_create_destroy: for i in $osds 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1417: test_mon_osd_create_destroy: [[ 1 != \4 ]] 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1416: test_mon_osd_create_destroy: for i in $osds 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1417: test_mon_osd_create_destroy: [[ 2 != \4 ]] 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1416: test_mon_osd_create_destroy: for i in $osds 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1417: test_mon_osd_create_destroy: [[ 3 != \4 ]] 2026-03-31T20:30:11.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1420: test_mon_osd_create_destroy: ceph osd find 4 2026-03-31T20:30:11.814 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:30:11.814 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 4, 2026-03-31T20:30:11.814 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:30:11.814 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [] 2026-03-31T20:30:11.814 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:30:11.814 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": 
"a4200020-4303-4949-9fb6-b0416e703438", 2026-03-31T20:30:11.814 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": {} 2026-03-31T20:30:11.814 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:30:11.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1421: test_mon_osd_create_destroy: ceph auth get-key osd.3 --format=json-pretty 2026-03-31T20:30:11.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1421: test_mon_osd_create_destroy: jq .key 2026-03-31T20:30:12.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1421: test_mon_osd_create_destroy: k='"AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q=="' 2026-03-31T20:30:12.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1422: test_mon_osd_create_destroy: cat /tmp/cephtool.sYl/secret.TrF5yY 2026-03-31T20:30:12.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1422: test_mon_osd_create_destroy: jq .cephx_secret 2026-03-31T20:30:12.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1422: test_mon_osd_create_destroy: s='"AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q=="' 2026-03-31T20:30:12.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1423: test_mon_osd_create_destroy: [[ "AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q==" == "AQDMLsxpYdvuMhAAsEP+qgpZMzcEVCt/nH419Q==" ]] 2026-03-31T20:30:12.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1424: test_mon_osd_create_destroy: expect_false ceph auth get-key client.osd-lockbox.a4200020-4303-4949-9fb6-b0416e703438 2026-03-31T20:30:12.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:12.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth get-key client.osd-lockbox.a4200020-4303-4949-9fb6-b0416e703438 2026-03-31T20:30:12.289 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: don't have client.osd-lockbox.a4200020-4303-4949-9fb6-b0416e703438 2026-03-31T20:30:12.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:12.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1425: test_mon_osd_create_destroy: expect_false ceph config-key exists dm-crypt/osd/a4200020-4303-4949-9fb6-b0416e703438/luks 2026-03-31T20:30:12.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:12.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph config-key exists dm-crypt/osd/a4200020-4303-4949-9fb6-b0416e703438/luks 2026-03-31T20:30:12.487 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: key 'dm-crypt/osd/a4200020-4303-4949-9fb6-b0416e703438/luks' doesn't exist 2026-03-31T20:30:12.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: 
return 0 2026-03-31T20:30:12.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1427: test_mon_osd_create_destroy: ceph osd destroy osd.4 --yes-i-really-mean-it 2026-03-31T20:30:12.697 INFO:tasks.workunit.client.0.vm03.stderr:destroyed osd.4 2026-03-31T20:30:12.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1428: test_mon_osd_create_destroy: ceph osd destroy 4 --yes-i-really-mean-it 2026-03-31T20:30:12.911 INFO:tasks.workunit.client.0.vm03.stderr:destroyed osd.4 2026-03-31T20:30:12.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1429: test_mon_osd_create_destroy: ceph osd find 4 2026-03-31T20:30:13.125 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:30:13.125 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 4, 2026-03-31T20:30:13.125 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:30:13.125 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [] 2026-03-31T20:30:13.125 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:30:13.125 INFO:tasks.workunit.client.0.vm03.stdout: "osd_fsid": "00000000-0000-0000-0000-000000000000", 2026-03-31T20:30:13.125 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": {} 2026-03-31T20:30:13.125 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:30:13.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1430: test_mon_osd_create_destroy: expect_false ceph auth get-key osd.4 2026-03-31T20:30:13.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:13.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth get-key osd.4 2026-03-31T20:30:13.317 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: don't have osd.4 2026-03-31T20:30:13.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:13.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1431: test_mon_osd_create_destroy: ceph osd dump 2026-03-31T20:30:13.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1431: test_mon_osd_create_destroy: grep osd.4 2026-03-31T20:30:13.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1431: test_mon_osd_create_destroy: grep destroyed 2026-03-31T20:30:13.528 INFO:tasks.workunit.client.0.vm03.stdout:osd.4 down in weight 1 up_from 0 up_thru 0 down_at 0 last_clean_interval [0,0) destroyed,exists,new 2026-03-31T20:30:13.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1433: test_mon_osd_create_destroy: id3=4 2026-03-31T20:30:13.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1434: test_mon_osd_create_destroy: uuidgen 2026-03-31T20:30:13.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1434: test_mon_osd_create_destroy: uuid3=2401dbda-7b4b-4b42-a3aa-dde1d631b46a 2026-03-31T20:30:13.530 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1435: test_mon_osd_create_destroy: ceph osd new 2401dbda-7b4b-4b42-a3aa-dde1d631b46a 4 -i /tmp/cephtool.sYl/secret.TrF5yY 2026-03-31T20:30:13.749 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-31T20:30:13.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1436: test_mon_osd_create_destroy: ceph osd dump 2026-03-31T20:30:13.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1436: test_mon_osd_create_destroy: grep osd.4 2026-03-31T20:30:13.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1436: test_mon_osd_create_destroy: expect_false grep destroyed 2026-03-31T20:30:13.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:13.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep destroyed 2026-03-31T20:30:13.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:13.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1437: test_mon_osd_create_destroy: ceph auth get-key client.osd-lockbox.2401dbda-7b4b-4b42-a3aa-dde1d631b46a 2026-03-31T20:30:14.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1438: test_mon_osd_create_destroy: ceph auth get-key osd.4 2026-03-31T20:30:14.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1439: test_mon_osd_create_destroy: ceph config-key exists dm-crypt/osd/2401dbda-7b4b-4b42-a3aa-dde1d631b46a/luks 2026-03-31T20:30:14.833 INFO:tasks.workunit.client.0.vm03.stderr:key 'dm-crypt/osd/2401dbda-7b4b-4b42-a3aa-dde1d631b46a/luks' exists 2026-03-31T20:30:14.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1441: test_mon_osd_create_destroy: ceph osd purge-new osd.4 --yes-i-really-mean-it 2026-03-31T20:30:15.059 INFO:tasks.workunit.client.0.vm03.stderr:osd.4 does not exist 2026-03-31T20:30:15.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1442: test_mon_osd_create_destroy: expect_false ceph osd find 4 2026-03-31T20:30:15.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:15.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd find 4 2026-03-31T20:30:15.217 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: osd.4 does not exist 2026-03-31T20:30:15.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:15.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1443: test_mon_osd_create_destroy: expect_false ceph auth get-key osd.4 2026-03-31T20:30:15.221 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:15.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth get-key osd.4 2026-03-31T20:30:15.401 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: don't have osd.4 2026-03-31T20:30:15.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:15.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1444: test_mon_osd_create_destroy: expect_false ceph auth get-key client.osd-lockbox.2401dbda-7b4b-4b42-a3aa-dde1d631b46a 2026-03-31T20:30:15.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:15.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth get-key client.osd-lockbox.2401dbda-7b4b-4b42-a3aa-dde1d631b46a 2026-03-31T20:30:15.580 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: don't have client.osd-lockbox.2401dbda-7b4b-4b42-a3aa-dde1d631b46a 2026-03-31T20:30:15.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:15.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1445: test_mon_osd_create_destroy: expect_false ceph config-key exists dm-crypt/osd/2401dbda-7b4b-4b42-a3aa-dde1d631b46a/luks 2026-03-31T20:30:15.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:15.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph config-key exists dm-crypt/osd/2401dbda-7b4b-4b42-a3aa-dde1d631b46a/luks 2026-03-31T20:30:15.771 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: key 'dm-crypt/osd/2401dbda-7b4b-4b42-a3aa-dde1d631b46a/luks' doesn't exist 2026-03-31T20:30:15.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:15.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1446: test_mon_osd_create_destroy: ceph osd purge osd.4 --yes-i-really-mean-it 2026-03-31T20:30:15.969 INFO:tasks.workunit.client.0.vm03.stderr:osd.4 does not exist 2026-03-31T20:30:15.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1447: test_mon_osd_create_destroy: ceph osd purge-new osd.4 --yes-i-really-mean-it 2026-03-31T20:30:16.183 INFO:tasks.workunit.client.0.vm03.stderr:osd.4 does not exist 2026-03-31T20:30:16.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1449: test_mon_osd_create_destroy: ceph osd purge osd.3 --yes-i-really-mean-it 2026-03-31T20:30:16.412 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 does not exist 2026-03-31T20:30:16.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1450: test_mon_osd_create_destroy: ceph osd purge 123456 
--yes-i-really-mean-it 2026-03-31T20:30:16.627 INFO:tasks.workunit.client.0.vm03.stderr:osd.123456 does not exist 2026-03-31T20:30:16.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1451: test_mon_osd_create_destroy: expect_false ceph osd find 3 2026-03-31T20:30:16.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:16.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd find 3 2026-03-31T20:30:16.794 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: osd.3 does not exist 2026-03-31T20:30:16.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:16.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1452: test_mon_osd_create_destroy: expect_false ceph auth get-key osd.3 2026-03-31T20:30:16.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:16.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth get-key osd.3 2026-03-31T20:30:16.977 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: don't have osd.3 2026-03-31T20:30:16.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:16.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1453: test_mon_osd_create_destroy: expect_false ceph auth get-key client.osd-lockbox.4582b05f-5b20-445a-a42b-89d12b4a71ba 2026-03-31T20:30:16.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:16.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph auth get-key client.osd-lockbox.4582b05f-5b20-445a-a42b-89d12b4a71ba 2026-03-31T20:30:17.160 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: don't have client.osd-lockbox.4582b05f-5b20-445a-a42b-89d12b4a71ba 2026-03-31T20:30:17.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:17.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1454: test_mon_osd_create_destroy: expect_false ceph config-key exists dm-crypt/osd/4582b05f-5b20-445a-a42b-89d12b4a71ba/luks 2026-03-31T20:30:17.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:17.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph config-key exists dm-crypt/osd/4582b05f-5b20-445a-a42b-89d12b4a71ba/luks 2026-03-31T20:30:17.356 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: key 'dm-crypt/osd/4582b05f-5b20-445a-a42b-89d12b4a71ba/luks' doesn't exist 2026-03-31T20:30:17.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: 
expect_false: return 0 2026-03-31T20:30:17.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1456: test_mon_osd_create_destroy: rm /tmp/cephtool.sYl/secret.uKnSkH /tmp/cephtool.sYl/secret.VxXuZ9 /tmp/cephtool.sYl/secret.TrF5yY /tmp/cephtool.sYl/secret.e330pQ /tmp/cephtool.sYl/secret.MD27xQ /tmp/cephtool.sYl/secret.wfEQRa /tmp/cephtool.sYl/secret.6SA3j2 2026-03-31T20:30:17.362 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1459: test_mon_osd_create_destroy: ceph osd ls 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1459: test_mon_osd_create_destroy: for i in $(ceph osd ls) 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1460: test_mon_osd_create_destroy: [[ 0 != \3 ]] 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1461: test_mon_osd_create_destroy: [[ 0 != \4 ]] 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1462: test_mon_osd_create_destroy: [[ 0 != \4 ]] 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1459: test_mon_osd_create_destroy: for i in $(ceph osd ls) 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1460: test_mon_osd_create_destroy: [[ 1 != \3 ]] 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1461: test_mon_osd_create_destroy: [[ 1 != \4 ]] 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1462: test_mon_osd_create_destroy: [[ 1 != \4 ]] 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1459: test_mon_osd_create_destroy: for i in $(ceph osd ls) 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1460: test_mon_osd_create_destroy: [[ 2 != \3 ]] 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1461: test_mon_osd_create_destroy: [[ 2 != \4 ]] 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1462: test_mon_osd_create_destroy: [[ 2 != \4 ]] 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1465: test_mon_osd_create_destroy: ceph osd ls 2026-03-31T20:30:17.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1465: test_mon_osd_create_destroy: wc -l 2026-03-31T20:30:17.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1465: test_mon_osd_create_destroy: [[ 3 == \3 ]] 2026-03-31T20:30:17.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1466: test_mon_osd_create_destroy: ceph osd setmaxosd 3 2026-03-31T20:30:19.338 
INFO:tasks.workunit.client.0.vm03.stderr:set new max_osd = 3 2026-03-31T20:30:19.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:30:19.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_osd_pool 2026-03-31T20:30:19.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2015: test_mon_osd_pool: ceph osd pool create data 16 2026-03-31T20:30:20.408 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data' already exists 2026-03-31T20:30:20.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2016: test_mon_osd_pool: ceph osd pool application enable data rados 2026-03-31T20:30:22.368 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'data' 2026-03-31T20:30:22.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2017: test_mon_osd_pool: ceph osd pool mksnap data datasnap 2026-03-31T20:30:23.423 INFO:tasks.workunit.client.0.vm03.stderr:pool data snap datasnap already exists 2026-03-31T20:30:23.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2018: test_mon_osd_pool: rados -p data lssnap 2026-03-31T20:30:23.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2018: test_mon_osd_pool: grep datasnap 2026-03-31T20:30:23.455 INFO:tasks.workunit.client.0.vm03.stdout:1 datasnap 2026.03.31 20:30:22 2026-03-31T20:30:23.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2019: test_mon_osd_pool: ceph osd pool rmsnap data datasnap 2026-03-31T20:30:24.432 INFO:tasks.workunit.client.0.vm03.stderr:pool data snap datasnap does not exist 2026-03-31T20:30:24.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2020: test_mon_osd_pool: expect_false ceph osd pool rmsnap pool_fake snapshot 2026-03-31T20:30:24.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:24.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool rmsnap pool_fake snapshot 2026-03-31T20:30:24.602 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: unrecognized pool 'pool_fake' 2026-03-31T20:30:24.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:24.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2021: test_mon_osd_pool: ceph osd pool delete data data --yes-i-really-really-mean-it 2026-03-31T20:30:25.450 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data' does not exist 2026-03-31T20:30:25.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2023: test_mon_osd_pool: ceph osd pool create data2 16 2026-03-31T20:30:26.449 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data2' already exists 2026-03-31T20:30:26.462
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2024: test_mon_osd_pool: ceph osd pool application enable data2 rados 2026-03-31T20:30:28.403 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'data2' 2026-03-31T20:30:28.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2025: test_mon_osd_pool: ceph osd pool rename data2 data3 2026-03-31T20:30:29.470 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data2' does not exist; pool 'data3' does -- assuming successful rename 2026-03-31T20:30:29.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2026: test_mon_osd_pool: ceph osd lspools 2026-03-31T20:30:29.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2026: test_mon_osd_pool: grep data3 2026-03-31T20:30:29.705 INFO:tasks.workunit.client.0.vm03.stdout:27 data3 2026-03-31T20:30:29.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2027: test_mon_osd_pool: ceph osd pool delete data3 data3 --yes-i-really-really-mean-it 2026-03-31T20:30:30.484 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data3' does not exist 2026-03-31T20:30:30.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2029: test_mon_osd_pool: ceph osd pool create replicated 16 16 replicated 2026-03-31T20:30:31.486 INFO:tasks.workunit.client.0.vm03.stderr:pool 'replicated' already exists 2026-03-31T20:30:31.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2030: test_mon_osd_pool: ceph osd pool create replicated 1 16 replicated 2026-03-31T20:30:31.695 INFO:tasks.workunit.client.0.vm03.stderr:pool 'replicated' already exists 2026-03-31T20:30:31.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2031: test_mon_osd_pool: ceph osd pool create replicated 16 16 2026-03-31T20:30:31.899 INFO:tasks.workunit.client.0.vm03.stderr:pool 'replicated' already exists 2026-03-31T20:30:31.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2032: test_mon_osd_pool: ceph osd pool create replicated 16 2026-03-31T20:30:32.099 INFO:tasks.workunit.client.0.vm03.stderr:pool 'replicated' already exists 2026-03-31T20:30:32.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2033: test_mon_osd_pool: ceph osd pool application enable replicated rados 2026-03-31T20:30:33.452 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'replicated' 2026-03-31T20:30:33.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2035: test_mon_osd_pool: expect_false ceph osd pool create replicated 16 16 erasure 2026-03-31T20:30:33.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:33.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool create replicated 16 16 erasure 2026-03-31T20:30:33.612 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: pool 'replicated' cannot change to type erasure 
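
[note] The repeated 'ceph osd pool create replicated ...' calls above illustrate the contract the workunit is checking: re-creating an existing pool with compatible arguments is an idempotent no-op (the "already exists" replies come back with exit status 0), while a re-create that would switch the pool from replicated to erasure is rejected with EINVAL. A minimal standalone sketch of the same checks, assuming a disposable test cluster and the hypothetical pool name idem-test:

    pool=idem-test                                 # hypothetical throwaway pool
    ceph osd pool create "$pool" 16 16 replicated  # first create
    ceph osd pool create "$pool" 16                # re-create: "already exists", rc=0
    ! ceph osd pool create "$pool" 16 16 erasure   # must fail: pool type cannot change
    ceph osd pool delete "$pool" "$pool" --yes-i-really-really-mean-it
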
2026-03-31T20:30:33.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:33.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2036: test_mon_osd_pool: ceph osd lspools 2026-03-31T20:30:33.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2036: test_mon_osd_pool: grep replicated 2026-03-31T20:30:33.821 INFO:tasks.workunit.client.0.vm03.stdout:28 replicated 2026-03-31T20:30:33.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2037: test_mon_osd_pool: ceph osd pool create ec_test 1 1 erasure 2026-03-31T20:30:34.511 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ec_test' already exists 2026-03-31T20:30:34.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2038: test_mon_osd_pool: ceph osd pool application enable ec_test rados 2026-03-31T20:30:36.479 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'ec_test' 2026-03-31T20:30:36.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2039: test_mon_osd_pool: set +e 2026-03-31T20:30:36.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2040: test_mon_osd_pool: ceph osd count-metadata osd_objectstore 2026-03-31T20:30:36.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2040: test_mon_osd_pool: grep bluestore 2026-03-31T20:30:36.695 INFO:tasks.workunit.client.0.vm03.stdout: "bluestore": 3 2026-03-31T20:30:36.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2041: test_mon_osd_pool: '[' 0 -eq 1 ']' 2026-03-31T20:30:36.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2045: test_mon_osd_pool: ceph osd pool set ec_test allow_ec_overwrites true 2026-03-31T20:30:38.512 INFO:tasks.workunit.client.0.vm03.stderr:set pool 29 allow_ec_overwrites to true 2026-03-31T20:30:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2046: test_mon_osd_pool: expect_false ceph osd pool set ec_test allow_ec_overwrites false 2026-03-31T20:30:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set ec_test allow_ec_overwrites false 2026-03-31T20:30:38.670 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: ec overwrites cannot be disabled once enabled 2026-03-31T20:30:38.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:38.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2048: test_mon_osd_pool: set -e 2026-03-31T20:30:38.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2049: test_mon_osd_pool: ceph osd pool delete replicated replicated --yes-i-really-really-mean-it 
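
[note] Just before this point the trace flips allow_ec_overwrites on for the erasure pool, then uses expect_false to confirm the monitor refuses to clear it ("ec overwrites cannot be disabled once enabled"); the flag is one-way, presumably because data already written with partial overwrites could not be served safely if it were cleared. A short sketch of that one-way switch, assuming an existing erasure-coded pool with the hypothetical name ec-demo:

    ceph osd pool set ec-demo allow_ec_overwrites true   # off -> on is allowed
    if ceph osd pool set ec-demo allow_ec_overwrites false; then
        echo "unexpected: overwrites were disabled" >&2  # monitor returns EINVAL here
        exit 1
    fi
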
2026-03-31T20:30:39.582 INFO:tasks.workunit.client.0.vm03.stderr:pool 'replicated' does not exist 2026-03-31T20:30:39.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2050: test_mon_osd_pool: ceph osd pool delete ec_test ec_test --yes-i-really-really-mean-it 2026-03-31T20:30:40.580 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ec_test' does not exist 2026-03-31T20:30:40.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2053: test_mon_osd_pool: ceph osd erasure-code-profile set foo foo 2026-03-31T20:30:41.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2054: test_mon_osd_pool: ceph osd erasure-code-profile ls 2026-03-31T20:30:41.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2054: test_mon_osd_pool: grep foo 2026-03-31T20:30:41.809 INFO:tasks.workunit.client.0.vm03.stdout:foo 2026-03-31T20:30:41.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2055: test_mon_osd_pool: ceph osd crush rule create-erasure foo foo 2026-03-31T20:30:42.597 INFO:tasks.workunit.client.0.vm03.stderr:rule foo already exists 2026-03-31T20:30:42.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2056: test_mon_osd_pool: ceph osd pool create erasure 16 16 erasure foo 2026-03-31T20:30:44.616 INFO:tasks.workunit.client.0.vm03.stderr:pool 'erasure' already exists 2026-03-31T20:30:44.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2057: test_mon_osd_pool: expect_false ceph osd erasure-code-profile rm foo 2026-03-31T20:30:44.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:44.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile rm foo 2026-03-31T20:30:44.778 INFO:tasks.workunit.client.0.vm03.stderr:Error EBUSY: erasure pool(s) are using the erasure code profile 'foo' 2026-03-31T20:30:44.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:44.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2058: test_mon_osd_pool: ceph osd pool delete erasure erasure --yes-i-really-really-mean-it 2026-03-31T20:30:45.619 INFO:tasks.workunit.client.0.vm03.stderr:pool 'erasure' does not exist 2026-03-31T20:30:45.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2059: test_mon_osd_pool: ceph osd crush rule rm foo 2026-03-31T20:30:46.636 INFO:tasks.workunit.client.0.vm03.stderr:rule foo does not exist 2026-03-31T20:30:46.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2060: test_mon_osd_pool: ceph osd erasure-code-profile rm foo 2026-03-31T20:30:47.643 INFO:tasks.workunit.client.0.vm03.stderr:erasure-code-profile foo does not exist 2026-03-31T20:30:47.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2063: test_mon_osd_pool: ceph osd pool create modeon 
--autoscale-mode=on 2026-03-31T20:30:48.647 INFO:tasks.workunit.client.0.vm03.stderr:pool 'modeon' already exists 2026-03-31T20:30:48.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2064: test_mon_osd_pool: ceph osd dump 2026-03-31T20:30:48.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2064: test_mon_osd_pool: grep modeon 2026-03-31T20:30:48.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2064: test_mon_osd_pool: grep 'autoscale_mode on' 2026-03-31T20:30:48.876 INFO:tasks.workunit.client.0.vm03.stdout:pool 31 'modeon' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 1 pgp_num 1 autoscale_mode on last_change 406 flags hashpspool,creating stripe_width 0 read_balance_score 2.99 2026-03-31T20:30:48.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2065: test_mon_osd_pool: ceph osd pool create modewarn --autoscale-mode=warn 2026-03-31T20:30:49.661 INFO:tasks.workunit.client.0.vm03.stderr:pool 'modewarn' already exists 2026-03-31T20:30:49.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2066: test_mon_osd_pool: ceph osd dump 2026-03-31T20:30:49.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2066: test_mon_osd_pool: grep modewarn 2026-03-31T20:30:49.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2066: test_mon_osd_pool: grep 'autoscale_mode warn' 2026-03-31T20:30:49.893 INFO:tasks.workunit.client.0.vm03.stdout:pool 32 'modewarn' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 32 pgp_num 32 autoscale_mode warn last_change 407 flags hashpspool,creating stripe_width 0 read_balance_score 1.50 2026-03-31T20:30:49.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2067: test_mon_osd_pool: ceph osd pool create modeoff --autoscale-mode=off 2026-03-31T20:30:50.666 INFO:tasks.workunit.client.0.vm03.stderr:pool 'modeoff' already exists 2026-03-31T20:30:50.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2068: test_mon_osd_pool: ceph osd dump 2026-03-31T20:30:50.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2068: test_mon_osd_pool: grep modeoff 2026-03-31T20:30:50.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2068: test_mon_osd_pool: grep 'autoscale_mode off' 2026-03-31T20:30:50.902 INFO:tasks.workunit.client.0.vm03.stdout:pool 33 'modeoff' replicated size 2 min_size 1 crush_rule 0 object_hash rjenkins pg_num 32 pgp_num 32 autoscale_mode off last_change 408 flags hashpspool,creating stripe_width 0 read_balance_score 1.69 2026-03-31T20:30:50.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2069: test_mon_osd_pool: ceph osd pool delete modeon modeon --yes-i-really-really-mean-it 2026-03-31T20:30:51.665 INFO:tasks.workunit.client.0.vm03.stderr:pool 'modeon' does not exist 2026-03-31T20:30:51.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2070: 
test_mon_osd_pool: ceph osd pool delete modewarn modewarn --yes-i-really-really-mean-it 2026-03-31T20:30:52.678 INFO:tasks.workunit.client.0.vm03.stderr:pool 'modewarn' does not exist 2026-03-31T20:30:52.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2071: test_mon_osd_pool: ceph osd pool delete modeoff modeoff --yes-i-really-really-mean-it 2026-03-31T20:30:53.707 INFO:tasks.workunit.client.0.vm03.stderr:pool 'modeoff' does not exist 2026-03-31T20:30:53.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:30:53.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_osd_pool_quota 2026-03-31T20:30:53.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2081: test_mon_osd_pool_quota: ceph osd pool create tmp-quota-pool 32 2026-03-31T20:30:54.724 INFO:tasks.workunit.client.0.vm03.stderr:pool 'tmp-quota-pool' already exists 2026-03-31T20:30:54.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2082: test_mon_osd_pool_quota: ceph osd pool application enable tmp-quota-pool rados 2026-03-31T20:30:56.683 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'tmp-quota-pool' 2026-03-31T20:30:56.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2086: test_mon_osd_pool_quota: expect_false ceph osd pool set-quota tmp-quota-pool max_fooness 10 2026-03-31T20:30:56.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:56.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set-quota tmp-quota-pool max_fooness 10 2026-03-31T20:30:56.862 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: max_fooness not in max_objects|max_bytes 2026-03-31T20:30:56.862 INFO:tasks.workunit.client.0.vm03.stderr:osd pool set-quota : set object or byte limit on pool 2026-03-31T20:30:56.862 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:30:56.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:56.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2087: test_mon_osd_pool_quota: expect_false ceph osd pool set-quota tmp-quota-pool max_bytes -1 2026-03-31T20:30:56.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:56.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set-quota tmp-quota-pool max_bytes -1 2026-03-31T20:30:57.022 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: error parsing value '-1': strict_iecstrtoll: value should not be negative 2026-03-31T20:30:57.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:57.027 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2088: test_mon_osd_pool_quota: expect_false ceph osd pool set-quota tmp-quota-pool max_objects aaa 2026-03-31T20:30:57.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:30:57.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set-quota tmp-quota-pool max_objects aaa 2026-03-31T20:30:57.187 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: error parsing value 'aaa': strict_si_cast: unit prefix not recognized 2026-03-31T20:30:57.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:30:57.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2092: test_mon_osd_pool_quota: ceph osd pool set-quota tmp-quota-pool max_bytes 10 2026-03-31T20:30:58.694 INFO:tasks.workunit.client.0.vm03.stderr:set-quota max_bytes = 10 for pool tmp-quota-pool 2026-03-31T20:30:58.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2093: test_mon_osd_pool_quota: ceph osd pool set-quota tmp-quota-pool max_objects 10M 2026-03-31T20:31:00.720 INFO:tasks.workunit.client.0.vm03.stderr:set-quota max_objects = 10000000 for pool tmp-quota-pool 2026-03-31T20:31:00.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2097: test_mon_osd_pool_quota: ceph osd pool get-quota tmp-quota-pool --format=json-pretty 2026-03-31T20:31:00.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2098: test_mon_osd_pool_quota: grep '"quota_max_objects":.*10000000' 2026-03-31T20:31:00.955 INFO:tasks.workunit.client.0.vm03.stdout: "quota_max_objects": 10000000, 2026-03-31T20:31:00.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2099: test_mon_osd_pool_quota: ceph osd pool get-quota tmp-quota-pool --format=json-pretty 2026-03-31T20:31:00.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2100: test_mon_osd_pool_quota: grep '"quota_max_bytes":.*10' 2026-03-31T20:31:01.185 INFO:tasks.workunit.client.0.vm03.stdout: "quota_max_bytes": 10, 2026-03-31T20:31:01.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2104: test_mon_osd_pool_quota: ceph osd pool get-quota tmp-quota-pool 2026-03-31T20:31:01.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2104: test_mon_osd_pool_quota: grep 'max bytes.*10 B' 2026-03-31T20:31:01.407 INFO:tasks.workunit.client.0.vm03.stdout: max bytes : 10 B (current num bytes: 0 bytes) 2026-03-31T20:31:01.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2105: test_mon_osd_pool_quota: ceph osd pool get-quota tmp-quota-pool 2026-03-31T20:31:01.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2105: test_mon_osd_pool_quota: grep 'max objects.*10.*M objects' 2026-03-31T20:31:01.626 INFO:tasks.workunit.client.0.vm03.stdout: max objects: 10M 
objects (current num objects: 0 objects) 2026-03-31T20:31:01.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2109: test_mon_osd_pool_quota: ceph osd pool set-quota tmp-quota-pool max_bytes 10K 2026-03-31T20:31:03.755 INFO:tasks.workunit.client.0.vm03.stderr:set-quota max_bytes = 10240 for pool tmp-quota-pool 2026-03-31T20:31:03.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2113: test_mon_osd_pool_quota: ceph osd pool get-quota tmp-quota-pool 2026-03-31T20:31:03.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2113: test_mon_osd_pool_quota: grep 'max bytes.*10 Ki' 2026-03-31T20:31:03.987 INFO:tasks.workunit.client.0.vm03.stdout: max bytes : 10 KiB (current num bytes: 0 bytes) 2026-03-31T20:31:03.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2117: test_mon_osd_pool_quota: ceph osd pool set-quota tmp-quota-pool max_bytes 10Ki 2026-03-31T20:31:05.771 INFO:tasks.workunit.client.0.vm03.stderr:set-quota max_bytes = 10240 for pool tmp-quota-pool 2026-03-31T20:31:05.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2121: test_mon_osd_pool_quota: ceph osd pool get-quota tmp-quota-pool 2026-03-31T20:31:05.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2121: test_mon_osd_pool_quota: grep 'max bytes.*10 Ki' 2026-03-31T20:31:06.008 INFO:tasks.workunit.client.0.vm03.stdout: max bytes : 10 KiB (current num bytes: 0 bytes) 2026-03-31T20:31:06.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2126: test_mon_osd_pool_quota: ceph osd pool set-quota tmp-quota-pool max_bytes 0 2026-03-31T20:31:07.784 INFO:tasks.workunit.client.0.vm03.stderr:set-quota max_bytes = 0 for pool tmp-quota-pool 2026-03-31T20:31:07.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2127: test_mon_osd_pool_quota: ceph osd pool set-quota tmp-quota-pool max_objects 0 2026-03-31T20:31:09.828 INFO:tasks.workunit.client.0.vm03.stderr:set-quota max_objects = 0 for pool tmp-quota-pool 2026-03-31T20:31:09.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2131: test_mon_osd_pool_quota: ceph osd pool get-quota tmp-quota-pool 2026-03-31T20:31:09.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2131: test_mon_osd_pool_quota: grep 'max bytes.*N/A' 2026-03-31T20:31:10.070 INFO:tasks.workunit.client.0.vm03.stdout: max bytes : N/A 2026-03-31T20:31:10.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2132: test_mon_osd_pool_quota: ceph osd pool get-quota tmp-quota-pool 2026-03-31T20:31:10.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2132: test_mon_osd_pool_quota: grep 'max objects.*N/A' 2026-03-31T20:31:10.291 INFO:tasks.workunit.client.0.vm03.stdout: max objects: N/A 2026-03-31T20:31:10.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2135: test_mon_osd_pool_quota: ceph osd pool delete tmp-quota-pool tmp-quota-pool 
--yes-i-really-really-mean-it 2026-03-31T20:31:10.909 INFO:tasks.workunit.client.0.vm03.stderr:pool 'tmp-quota-pool' does not exist 2026-03-31T20:31:10.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:31:11.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_pg 2026-03-31T20:31:11.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2141: test_mon_pg: wait_for_health_ok 2026-03-31T20:31:11.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1855: wait_for_health_ok: wait_for_health HEALTH_OK 2026-03-31T20:31:11.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1834: wait_for_health: local grepstr=HEALTH_OK 2026-03-31T20:31:11.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: get_timeout_delays 300 .1 2026-03-31T20:31:11.142 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:31:11.142 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:31:11.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:31:11.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:31:11.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:31:11.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '4.5') 2026-03-31T20:31:11.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: local -a delays 2026-03-31T20:31:11.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1836: wait_for_health: local -i loop=0 2026-03-31T20:31:11.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:11.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:11.601 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_OK 2026-03-31T20:31:11.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2143: 
2026-03-31T20:31:11.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2143: test_mon_pg: ceph pg debug unfound_objects_exist
2026-03-31T20:31:11.804 INFO:tasks.workunit.client.0.vm03.stdout:FALSE
2026-03-31T20:31:11.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2144: test_mon_pg: ceph pg debug degraded_pgs_exist
2026-03-31T20:31:12.019 INFO:tasks.workunit.client.0.vm03.stdout:FALSE
2026-03-31T20:31:12.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2145: test_mon_pg: ceph pg deep-scrub 1.0
2026-03-31T20:31:12.231 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to deep-scrub
2026-03-31T20:31:12.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2146: test_mon_pg: ceph pg dump
2026-03-31T20:31:12.440 INFO:tasks.workunit.client.0.vm03.stdout:version 457
2026-03-31T20:31:12.440 INFO:tasks.workunit.client.0.vm03.stdout:stamp 2026-03-31T20:31:11.657815+0000
2026-03-31T20:31:12.440 INFO:tasks.workunit.client.0.vm03.stdout:last_osdmap_epoch 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:last_pg_scan 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:1.0 2 0 0 0 0 459280 0 0 41 0 41 active+clean 2026-03-31T20:29:15.918391+0000 172'41 425:989 [1,2] 1 [1,2] 1 172'41 2026-03-31T20:26:36.459878+0000 172'41 2026-03-31T20:26:29.486746+0000 0 0 periodic scrub scheduled @ 2026-04-02T08:18:10.345705+0000 2 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2.3 1 0 0 0 0 0 0 0 2 0 2 active+clean 2026-03-31T20:26:37.430938+0000 172'2 425:870 [1,2] 1 [1,2] 1 172'2 2026-03-31T20:26:37.430905+0000 172'2 2026-03-31T20:26:31.466479+0000 0 0 periodic scrub scheduled @ 2026-04-02T04:16:40.214034+0000 1 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:26:33.059521+0000 0'0 425:860 [2,1] 2 [2,1] 2 0'0 2026-03-31T20:26:33.059473+0000 0'0 2026-03-31T20:26:31.028583+0000 0 0 periodic scrub scheduled @ 2026-04-02T05:48:20.649520+0000 0 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:26:34.047630+0000 0'0 425:860 [2,1] 2 [2,1] 2 0'0 2026-03-31T20:26:34.047600+0000 0'0 2026-03-31T20:26:32.053167+0000 0 0 periodic scrub scheduled @ 2026-04-02T07:14:32.884696+0000 0 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 0 19 0 0 3 0 3 active+clean 2026-03-31T20:29:18.927326+0000 172'3 425:785 [0,1] 0 [0,1] 0 172'3 2026-03-31T20:26:31.889790+0000 172'3 2026-03-31T20:26:30.871302+0000 0 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 1 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:29:18.927905+0000 0'0 425:891 [1,0] 1 [1,0] 1 0'0 2026-03-31T20:26:39.414769+0000 0'0 2026-03-31T20:26:33.464380+0000 0 0 periodic scrub scheduled @ 2026-04-02T01:40:05.887464+0000 0 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2.5 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:29:15.912780+0000 0'0 425:892 [1,2] 1 [1,2] 1 0'0 2026-03-31T20:26:38.458279+0000 0'0 2026-03-31T20:26:32.494594+0000 0 0 periodic scrub scheduled @ 2026-04-01T22:25:49.071901+0000 0 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2.6 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:29:15.912481+0000 0'0 425:892 [1,2] 1 [1,2] 1 0'0 2026-03-31T20:26:41.404628+0000 0'0 2026-03-31T20:26:35.469461+0000 0 0 periodic scrub scheduled @ 2026-04-02T03:20:25.566652+0000 0 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2.7 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-31T20:29:15.912306+0000 0'0 425:892 [1,2] 1 [1,2] 1 0'0 2026-03-31T20:26:40.426768+0000 0'0 2026-03-31T20:26:34.453503+0000 0 0 periodic scrub scheduled @ 2026-04-02T02:54:10.033462+0000 0 0
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2 2 0 0 0 0 19 0 0 5 5
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:1 2 0 0 0 0 459280 0 0 41 41
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:sum 4 0 0 0 0 459299 0 0 46 46
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:OSD_STAT USED AVAIL USED_RAW TOTAL HB_PEERS PG_SUM PRIMARY_PG_SUM
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:2 30 MiB 90 GiB 30 MiB 90 GiB [0,1] 7 2
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:1 30 MiB 90 GiB 30 MiB 90 GiB [0,2] 9 6
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:0 30 MiB 90 GiB 30 MiB 90 GiB [1,2] 2 1
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:sum 89 MiB 270 GiB 89 MiB 270 GiB
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-31T20:31:12.441 INFO:tasks.workunit.client.0.vm03.stderr:dumped all
2026-03-31T20:31:12.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2147: test_mon_pg: ceph pg dump pgs_brief --format=json
2026-03-31T20:31:12.647 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:31:12.647 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs_brief
2026-03-31T20:31:12.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2148: test_mon_pg: ceph pg dump pgs --format=json
2026-03-31T20:31:12.857 INFO:tasks.workunit.client.0.vm03.stdout:{"pg_ready":true,"pg_stats":[{"pgid":"1.0","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.3","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.0","state":"active+clean","up":[2,1],"acting":[2,1],"up_primary":2,"acting_primary":2},{"pgid":"2.1","state":"active+clean","up":[2,1],"acting":[2,1],"up_primary":2,"acting_primary":2},{"pgid":"2.2","state":"active+clean","up":[0,1],"acting":[0,1],"up_primary":0,"acting_primary":0},{"pgid":"2.4","state":"active+clean","up":[1,0],"acting":[1,0],"up_primary":1,"acting_primary":1},{"pgid":"2.5","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.6","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.7","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1}]}
2026-03-31T20:31:12.857 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-31T20:31:12.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2149: test_mon_pg: ceph pg dump pools --format=json
2026-03-31T20:31:13.071 INFO:tasks.workunit.client.0.vm03.stdout:{"pg_ready":true,"pg_stats":[{"pgid":"1.0","version":"172'41","reported_seq":989,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825149+0000","last_change":"2026-03-31T20:29:15.918391+0000","last_active":"2026-03-31T20:31:08.825149+0000","last_peered":"2026-03-31T20:31:08.825149+0000","last_clean":"2026-03-31T20:31:08.825149+0000","last_became_active":"2026-03-31T20:29:15.913130+0000","last_became_peered":"2026-03-31T20:29:15.913130+0000","last_unstale":"2026-03-31T20:31:08.825149+0000","last_undegraded":"2026-03-31T20:31:08.825149+0000","last_fullsized":"2026-03-31T20:31:08.825149+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'41","last_scrub_stamp":"2026-03-31T20:26:36.459878+0000","last_deep_scrub":"172'41","last_deep_scrub_stamp":"2026-03-31T20:26:29.486746+0000","last_clean_scrub_stamp":"2026-03-31T20:26:36.459878+0000","objects_scrubbed":2,"log_size":41,"log_dups_size":0,"ondisk_log_size":41,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T08:18:10.345705+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":6,"num_bytes_recovered":1377840,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.3","version":"172'2","reported_seq":870,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825177+0000","last_change":"2026-03-31T20:26:37.430938+0000","last_active":"2026-03-31T20:31:08.825177+0000","last_peered":"2026-03-31T20:31:08.825177+0000","last_clean":"2026-03-31T20:31:08.825177+0000","last_became_active":"2026-03-31T20:21:33.038753+0000","last_became_peered":"2026-03-31T20:21:33.038753+0000","last_unstale":"2026-03-31T20:31:08.825177+0000","last_undegraded":"2026-03-31T20:31:08.825177+0000","last_fullsized":"2026-03-31T20:31:08.825177+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'2","last_scrub_stamp":"2026-03-31T20:26:37.430905+0000","last_deep_scrub":"172'2","last_deep_scrub_stamp":"2026-03-31T20:26:31.466479+0000","last_clean_scrub_stamp":"2026-03-31T20:26:37.430905+0000","objects_scrubbed":1,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T04:16:40.214034+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.0","version":"0'0","reported_seq":860,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825836+0000","last_change":"2026-03-31T20:26:33.059521+0000","last_active":"2026-03-31T20:31:08.825836+0000","last_peered":"2026-03-31T20:31:08.825836+0000","last_clean":"2026-03-31T20:31:08.825836+0000","last_became_active":"2026-03-31T20:21:33.036655+0000","last_became_peered":"2026-03-31T20:21:33.036655+0000","last_unstale":"2026-03-31T20:31:08.825836+0000","last_undegraded":"2026-03-31T20:31:08.825836+0000","last_fullsized":"2026-03-31T20:31:08.825836+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:33.059473+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:31.028583+0000","last_clean_scrub_stamp":"2026-03-31T20:26:33.059473+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T05:48:20.649520+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.1","version":"0'0","reported_seq":860,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.826068+0000","last_change":"2026-03-31T20:26:34.047630+0000","last_active":"2026-03-31T20:31:08.826068+0000","last_peered":"2026-03-31T20:31:08.826068+0000","last_clean":"2026-03-31T20:31:08.826068+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:31:08.826068+0000","last_undegraded":"2026-03-31T20:31:08.826068+0000","last_fullsized":"2026-03-31T20:31:08.826068+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:34.047600+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.053167+0000","last_clean_scrub_stamp":"2026-03-31T20:26:34.047600+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T07:14:32.884696+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.2","version":"172'3","reported_seq":785,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825965+0000","last_change":"2026-03-31T20:29:18.927326+0000","last_active":"2026-03-31T20:31:08.825965+0000","last_peered":"2026-03-31T20:31:08.825965+0000","last_clean":"2026-03-31T20:31:08.825965+0000","last_became_active":"2026-03-31T20:29:18.927209+0000","last_became_peered":"2026-03-31T20:29:18.927209+0000","last_unstale":"2026-03-31T20:31:08.825965+0000","last_undegraded":"2026-03-31T20:31:08.825965+0000","last_fullsized":"2026-03-31T20:31:08.825965+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'3","last_scrub_stamp":"2026-03-31T20:26:31.889790+0000","last_deep_scrub":"172'3","last_deep_scrub_stamp":"2026-03-31T20:26:30.871302+0000","last_clean_scrub_stamp":"2026-03-31T20:26:31.889790+0000","objects_scrubbed":1,"log_size":3,"log_dups_size":0,"ondisk_log_size":3,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T04:32:06.896389+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.4","version":"0'0","reported_seq":891,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825189+0000","last_change":"2026-03-31T20:29:18.927905+0000","last_active":"2026-03-31T20:31:08.825189+0000","last_peered":"2026-03-31T20:31:08.825189+0000","last_clean":"2026-03-31T20:31:08.825189+0000","last_became_active":"2026-03-31T20:29:18.927814+0000","last_became_peered":"2026-03-31T20:29:18.927814+0000","last_unstale":"2026-03-31T20:31:08.825189+0000","last_undegraded":"2026-03-31T20:31:08.825189+0000","last_fullsized":"2026-03-31T20:31:08.825189+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:39.414769+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:33.464380+0000","last_clean_scrub_stamp":"2026-03-31T20:26:39.414769+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:40:05.887464+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"0'0","reported_seq":892,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.829888+0000","last_change":"2026-03-31T20:29:15.912780+0000","last_active":"2026-03-31T20:31:08.829888+0000","last_peered":"2026-03-31T20:31:08.829888+0000","last_clean":"2026-03-31T20:31:08.829888+0000","last_became_active":"2026-03-31T20:29:15.912702+0000","last_became_peered":"2026-03-31T20:29:15.912702+0000","last_unstale":"2026-03-31T20:31:08.829888+0000","last_undegraded":"2026-03-31T20:31:08.829888+0000","last_fullsized":"2026-03-31T20:31:08.829888+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:38.458279+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.494594+0000","last_clean_scrub_stamp":"2026-03-31T20:26:38.458279+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T22:25:49.071901+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"0'0","reported_seq":892,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825239+0000","last_change":"2026-03-31T20:29:15.912481+0000","last_active":"2026-03-31T20:31:08.825239+0000","last_peered":"2026-03-31T20:31:08.825239+0000","last_clean":"2026-03-31T20:31:08.825239+0000","last_became_active":"2026-03-31T20:29:15.912185+0000","last_became_peered":"2026-03-31T20:29:15.912185+0000","last_unstale":"2026-03-31T20:31:08.825239+0000","last_undegraded":"2026-03-31T20:31:08.825239+0000","last_fullsized":"2026-03-31T20:31:08.825239+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:41.404628+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:35.469461+0000","last_clean_scrub_stamp":"2026-03-31T20:26:41.404628+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T03:20:25.566652+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.7","version":"0'0","reported_seq":892,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825262+0000","last_change":"2026-03-31T20:29:15.912306+0000","last_active":"2026-03-31T20:31:08.825262+0000","last_peered":"2026-03-31T20:31:08.825262+0000","last_clean":"2026-03-31T20:31:08.825262+0000","last_became_active":"2026-03-31T20:29:15.912193+0000","last_became_peered":"2026-03-31T20:29:15.912193+0000","last_unstale":"2026-03-31T20:31:08.825262+0000","last_undegraded":"2026-03-31T20:31:08.825262+0000","last_fullsized":"2026-03-31T20:31:08.825262+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:40.426768+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:34.453503+0000","last_clean_scrub_stamp":"2026-03-31T20:26:40.426768+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T02:54:10.033462+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}]} 2026-03-31T20:31:13.071 INFO:tasks.workunit.client.0.vm03.stderr:dumped pools 2026-03-31T20:31:13.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2150: test_mon_pg: ceph pg dump osds --format=json 2026-03-31T20:31:13.288 INFO:tasks.workunit.client.0.vm03.stdout:{"pg_ready":true,"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":7162,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":6,"num_bytes_recovered":1377840,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"n
um_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":471040,"data_stored":918560,"data_compressed":9580,"data_compressed_allocated":466944,"data_compressed_original":914464,"omap_allocated":2560,"internal_metadata":0},"log_size":41,"ondisk_log_size":41,"up":2,"acting":2,"num_store_stats":3}]} 2026-03-31T20:31:13.288 INFO:tasks.workunit.client.0.vm03.stderr:dumped osds 2026-03-31T20:31:13.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2151: test_mon_pg: ceph pg dump sum --format=json 2026-03-31T20:31:13.504 INFO:tasks.workunit.client.0.vm03.stdout:{"pg_ready":true,"osd_stats":[{"osd":2,"up_from":8,"seq":34359738492,"num_pgs":33,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30620,"kb_used_data":3788,"kb_used_omap":269,"kb_used_meta":26546,"kb_avail":94341220,"statfs":{"total":96636764160,"available":96605409280,"internally_reserved":0,"allocated":3878912,"data_stored":1954068,"data_compressed":15522,"data_compressed_allocated":290816,"data_compressed_original":562243,"omap_allocated":276118,"internal_metadata":27183466},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738492,"num_pgs":35,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30628,"kb_used_data":3796,"kb_used_omap":242,"kb_used_meta":26573,"kb_avail":94341212,"statfs":{"total":96636764160,"available":96605401088,"internally_reserved":0,"allocated":3887104,"data_stored":1954087,"data_compressed":15550,"data_compressed_allocated":290816,"data_compressed_original":558147,"omap_allocated":248041,"internal_metadata":27211543},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772540,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30396,"kb_used_data":3564,"kb_used_omap":270,"kb_used_meta":26545,"kb_avail":94341444,"statfs":{"total":96636764160,"available":96605638656,"internally_reserved":0,"allocated":3649536,"data_stored":1494807,"data_compressed":10746,"data_compressed_allocated":57344,"data_compressed_original":102963,"omap_allocated":276512,"internal_metadata":27183072},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":2560,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"inte
rnally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3130,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1971,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2061,"internal_metadata":0}]} 2026-03-31T20:31:13.504 INFO:tasks.workunit.client.0.vm03.stderr:dumped sum 2026-03-31T20:31:13.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2152: test_mon_pg: ceph pg dump all --format=json 2026-03-31T20:31:13.726 INFO:tasks.workunit.client.0.vm03.stdout:{"pg_ready":true,"pg_map":{"version":457,"stamp":"2026-03-31T20:31:11.657815+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":71,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1377878,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46,"ondisk_log_size":46,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":82,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":91644,"kb_used_data":11148,"kb_used_omap":781,"kb_used_meta":79666,"kb_avail":283023876,"statfs":{"total":289910292480,"available":289816449024,"internally_reserved":0,"allocated":11415552,"data_stored":5402962,"data_compressed":41818,"data_compressed_allocated":638976,"data_compressed_original":1223353,"omap_allocated":800671,"internal_metadata":81578081},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":3,"apply_latency_ms":3,"commit_latency_ns":3000000,"apply_latency_ns":3000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objec
ts_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"4.000624"}}} 2026-03-31T20:31:13.726 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-31T20:31:13.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2153: test_mon_pg: ceph pg dump pgs_brief osds --format=json 2026-03-31T20:31:13.948 INFO:tasks.workunit.client.0.vm03.stdout:{"pg_ready":true,"pg_map":{"version":458,"stamp":"2026-03-31T20:31:13.658101+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":71,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1377878,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46,"ondisk_log_size":46,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":82,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":91644,"kb_used_data":11148,"kb_used_omap":781,"kb_used_meta":79666,"kb_avail":283023876,"statfs":{"total":289910292480,"available":289816449024,"internally_reserved":0,"allocated":11415552,"data_stored":5402962,"data_compressed":41818,"data_compressed_allocated":638976,"data_compressed_original":1223353,"omap_allocated":800671,"internal_metadata":81578081},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":3,"apply_latency_ms":3,"commit_latency_ns":3000000,"apply_latency_ns":3000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary"
:0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"5.878849"},"pg_stats":[{"pgid":"1.0","version":"172'41","reported_seq":989,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825149+0000","last_change":"2026-03-31T20:29:15.918391+0000","last_active":"2026-03-31T20:31:08.825149+0000","last_peered":"2026-03-31T20:31:08.825149+0000","last_clean":"2026-03-31T20:31:08.825149+0000","last_became_active":"2026-03-31T20:29:15.913130+0000","last_became_peered":"2026-03-31T20:29:15.913130+0000","last_unstale":"2026-03-31T20:31:08.825149+0000","last_undegraded":"2026-03-31T20:31:08.825149+0000","last_fullsized":"2026-03-31T20:31:08.825149+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'41","last_scrub_stamp":"2026-03-31T20:26:36.459878+0000","last_deep_scrub":"172'41","last_deep_scrub_stamp":"2026-03-31T20:26:29.486746+0000","last_clean_scrub_stamp":"2026-03-31T20:26:36.459878+0000","objects_scrubbed":2,"log_size":41,"log_dups_size":0,"ondisk_log_size":41,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T08:18:10.345705+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":6,"num_bytes_recovered":1377840,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.3","version":"172'2","reported_seq":870,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825177+0000","last_change":"2026-03-31T20:26:37.430938+0000","last_active":"2026-03-31T20:31:08.825177+0000","last_peered":"2026-03-31T20:31:08.825177+0000","last_clean":"2026-03-31T20:31:08.825177+0000","last_became_active":"2026-03-31T20:21:33.038753+0000","last_became_peered":"2026-03-31T20:21:33.038753+0000","last_unstale":"2026-03-31T20:31:08.825177+0000","last_undegraded":"2026-03-31T20:31:08.825177+0000","last_fullsized":"2026-03-31T20:31:08.825177+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'2","last_scrub_stamp":"2026-03-31T20:26:37.430905+0000","last_deep_scrub":"172'2","last_deep_scrub_stamp":"2026-03-31T20:26:31.466479+0000","last_clean_scrub_stamp":"2026-03-31T20:26:37.430905+0000","objects_scrubbed":1,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T04:16:40.214034+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.0","version":"0'0","reported_seq":860,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825836+0000","last_change":"2026-03-31T20:26:33.059521+0000","last_active":"2026-03-31T20:31:08.825836+0000","last_peered":"2026-03-31T20:31:08.825836+0000","last_clean":"2026-03-31T20:31:08.825836+0000","last_became_active":"2026-03-31T20:21:33.036655+0000","last_became_peered":"2026-03-31T20:21:33.036655+0000","last_unstale":"2026-03-31T20:31:08.825836+0000","last_undegraded":"2026-03-31T20:31:08.825836+0000","last_fullsized":"2026-03-31T20:31:08.825836+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:33.059473+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:31.028583+0000","last_clean_scrub_stamp":"2026-03-31T20:26:33.059473+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T05:48:20.649520+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.1","version":"0'0","reported_seq":860,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.826068+0000","last_change":"2026-03-31T20:26:34.047630+0000","last_active":"2026-03-31T20:31:08.826068+0000","last_peered":"2026-03-31T20:31:08.826068+0000","last_clean":"2026-03-31T20:31:08.826068+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:31:08.826068+0000","last_undegraded":"2026-03-31T20:31:08.826068+0000","last_fullsized":"2026-03-31T20:31:08.826068+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:34.047600+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.053167+0000","last_clean_scrub_stamp":"2026-03-31T20:26:34.047600+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T07:14:32.884696+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.2","version":"172'3","reported_seq":785,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825965+0000","last_change":"2026-03-31T20:29:18.927326+0000","last_active":"2026-03-31T20:31:08.825965+0000","last_peered":"2026-03-31T20:31:08.825965+0000","last_clean":"2026-03-31T20:31:08.825965+0000","last_became_active":"2026-03-31T20:29:18.927209+0000","last_became_peered":"2026-03-31T20:29:18.927209+0000","last_unstale":"2026-03-31T20:31:08.825965+0000","last_undegraded":"2026-03-31T20:31:08.825965+0000","last_fullsized":"2026-03-31T20:31:08.825965+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'3","last_scrub_stamp":"2026-03-31T20:26:31.889790+0000","last_deep_scrub":"172'3","last_deep_scrub_stamp":"2026-03-31T20:26:30.871302+0000","last_clean_scrub_stamp":"2026-03-31T20:26:31.889790+0000","objects_scrubbed":1,"log_size":3,"log_dups_size":0,"ondisk_log_size":3,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T04:32:06.896389+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.4","version":"0'0","reported_seq":891,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825189+0000","last_change":"2026-03-31T20:29:18.927905+0000","last_active":"2026-03-31T20:31:08.825189+0000","last_peered":"2026-03-31T20:31:08.825189+0000","last_clean":"2026-03-31T20:31:08.825189+0000","last_became_active":"2026-03-31T20:29:18.927814+0000","last_became_peered":"2026-03-31T20:29:18.927814+0000","last_unstale":"2026-03-31T20:31:08.825189+0000","last_undegraded":"2026-03-31T20:31:08.825189+0000","last_fullsized":"2026-03-31T20:31:08.825189+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:39.414769+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:33.464380+0000","last_clean_scrub_stamp":"2026-03-31T20:26:39.414769+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:40:05.887464+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"0'0","reported_seq":892,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.829888+0000","last_change":"2026-03-31T20:29:15.912780+0000","last_active":"2026-03-31T20:31:08.829888+0000","last_peered":"2026-03-31T20:31:08.829888+0000","last_clean":"2026-03-31T20:31:08.829888+0000","last_became_active":"2026-03-31T20:29:15.912702+0000","last_became_peered":"2026-03-31T20:29:15.912702+0000","last_unstale":"2026-03-31T20:31:08.829888+0000","last_undegraded":"2026-03-31T20:31:08.829888+0000","last_fullsized":"2026-03-31T20:31:08.829888+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:38.458279+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.494594+0000","last_clean_scrub_stamp":"2026-03-31T20:26:38.458279+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T22:25:49.071901+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"0'0","reported_seq":892,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825239+0000","last_change":"2026-03-31T20:29:15.912481+0000","last_active":"2026-03-31T20:31:08.825239+0000","last_peered":"2026-03-31T20:31:08.825239+0000","last_clean":"2026-03-31T20:31:08.825239+0000","last_became_active":"2026-03-31T20:29:15.912185+0000","last_became_peered":"2026-03-31T20:29:15.912185+0000","last_unstale":"2026-03-31T20:31:08.825239+0000","last_undegraded":"2026-03-31T20:31:08.825239+0000","last_fullsized":"2026-03-31T20:31:08.825239+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:41.404628+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:35.469461+0000","last_clean_scrub_stamp":"2026-03-31T20:26:41.404628+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T03:20:25.566652+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.7","version":"0'0","reported_seq":892,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825262+0000","last_change":"2026-03-31T20:29:15.912306+0000","last_active":"2026-03-31T20:31:08.825262+0000","last_peered":"2026-03-31T20:31:08.825262+0000","last_clean":"2026-03-31T20:31:08.825262+0000","last_became_active":"2026-03-31T20:29:15.912193+0000","last_became_peered":"2026-03-31T20:29:15.912193+0000","last_unstale":"2026-03-31T20:31:08.825262+0000","last_undegraded":"2026-03-31T20:31:08.825262+0000","last_fullsized":"2026-03-31T20:31:08.825262+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:40.426768+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:34.453503+0000","last_clean_scrub_stamp":"2026-03-31T20:26:40.426768+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T02:54:10.033462+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":7162,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":6,"num_bytes_recovered":1377840,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":471040,"data_stored":918560,"data_compressed":9580,"data_compressed_allocated":466944,"data_compressed_original":914464,"omap_allocated":2560,"internal_metadata":0},"log_size":41,"ond
isk_log_size":41,"up":2,"acting":2,"num_store_stats":3}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738492,"num_pgs":33,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30620,"kb_used_data":3788,"kb_used_omap":269,"kb_used_meta":26546,"kb_avail":94341220,"statfs":{"total":96636764160,"available":96605409280,"internally_reserved":0,"allocated":3878912,"data_stored":1954068,"data_compressed":15522,"data_compressed_allocated":290816,"data_compressed_original":562243,"omap_allocated":276118,"internal_metadata":27183466},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738492,"num_pgs":35,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30628,"kb_used_data":3796,"kb_used_omap":242,"kb_used_meta":26573,"kb_avail":94341212,"statfs":{"total":96636764160,"available":96605401088,"internally_reserved":0,"allocated":3887104,"data_stored":1954087,"data_compressed":15550,"data_compressed_allocated":290816,"data_compressed_original":558147,"omap_allocated":248041,"internal_metadata":27211543},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772540,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30396,"kb_used_data":3564,"kb_used_omap":270,"kb_used_meta":26545,"kb_avail":94341444,"statfs":{"total":96636764160,"available":96605638656,"internally_reserved":0,"allocated":3649536,"data_stored":1494807,"data_compressed":10746,"data_compressed_allocated":57344,"data_compressed_original":102963,"omap_allocated":276512,"internal_metadata":27183072},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":2560,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3130,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1971,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available"
:0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2061,"internal_metadata":0}]}} 2026-03-31T20:31:13.948 INFO:tasks.workunit.client.0.vm03.stderr:dumped osds,pgs_brief 2026-03-31T20:31:13.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2154: test_mon_pg: ceph pg dump pools osds pgs_brief --format=json 2026-03-31T20:31:14.164 INFO:tasks.workunit.client.0.vm03.stdout:{"pg_ready":true,"osd_stats":[{"osd":2,"up_from":8,"seq":34359738492,"num_pgs":33,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30620,"kb_used_data":3788,"kb_used_omap":269,"kb_used_meta":26546,"kb_avail":94341220,"statfs":{"total":96636764160,"available":96605409280,"internally_reserved":0,"allocated":3878912,"data_stored":1954068,"data_compressed":15522,"data_compressed_allocated":290816,"data_compressed_original":562243,"omap_allocated":276118,"internal_metadata":27183466},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738492,"num_pgs":35,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30628,"kb_used_data":3796,"kb_used_omap":242,"kb_used_meta":26573,"kb_avail":94341212,"statfs":{"total":96636764160,"available":96605401088,"internally_reserved":0,"allocated":3887104,"data_stored":1954087,"data_compressed":15550,"data_compressed_allocated":290816,"data_compressed_original":558147,"omap_allocated":248041,"internal_metadata":27211543},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772540,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30396,"kb_used_data":3564,"kb_used_omap":270,"kb_used_meta":26545,"kb_avail":94341444,"statfs":{"total":96636764160,"available":96605638656,"internally_reserved":0,"allocated":3649536,"data_stored":1494807,"data_compressed":10746,"data_compressed_allocated":57344,"data_compressed_original":102963,"omap_allocated":276512,"internal_metadata":27183072},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":2560,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0
},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3130,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1971,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2061,"internal_metadata":0}],"pg_stats":[{"pgid":"1.0","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.3","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.0","state":"active+clean","up":[2,1],"acting":[2,1],"up_primary":2,"acting_primary":2},{"pgid":"2.1","state":"active+clean","up":[2,1],"acting":[2,1],"up_primary":2,"acting_primary":2},{"pgid":"2.2","state":"active+clean","up":[0,1],"acting":[0,1],"up_primary":0,"acting_primary":0},{"pgid":"2.4","state":"active+clean","up":[1,0],"acting":[1,0],"up_primary":1,"acting_primary":1},{"pgid":"2.5","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.6","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.7","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1}]} 2026-03-31T20:31:14.165 INFO:tasks.workunit.client.0.vm03.stderr:dumped osds,pgs_brief,pools 2026-03-31T20:31:14.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2155: test_mon_pg: ceph pg dump_json 2026-03-31T20:31:14.379 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-31T20:31:14.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2156: test_mon_pg: ceph pg dump_pools_json 2026-03-31T20:31:14.593 INFO:tasks.workunit.client.0.vm03.stderr:dumped pools 2026-03-31T20:31:14.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2157: test_mon_pg: ceph pg dump_stuck inactive 2026-03-31T20:31:14.809 INFO:tasks.workunit.client.0.vm03.stderr:ok 2026-03-31T20:31:14.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2158: test_mon_pg: ceph pg dump_stuck unclean 2026-03-31T20:31:15.041 INFO:tasks.workunit.client.0.vm03.stderr:ok 2026-03-31T20:31:15.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2159: test_mon_pg: ceph pg dump_stuck stale 2026-03-31T20:31:15.260 INFO:tasks.workunit.client.0.vm03.stderr:ok 2026-03-31T20:31:15.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2160: test_mon_pg: ceph pg dump_stuck undersized 2026-03-31T20:31:15.477 INFO:tasks.workunit.client.0.vm03.stderr:ok 2026-03-31T20:31:15.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2161: test_mon_pg: ceph pg dump_stuck degraded 2026-03-31T20:31:15.699 INFO:tasks.workunit.client.0.vm03.stderr:ok 2026-03-31T20:31:15.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2162: test_mon_pg: ceph 
pg ls 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:{"pg_ready":true,"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":7162,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":6,"num_bytes_recovered":1377840,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":471040,"data_stored":918560,"data_compressed":9580,"data_compressed_allocated":466944,"data_compressed_original":914464,"omap_allocated":2560,"internal_metadata":0},"log_size":41,"ondisk_log_size":41,"up":2,"acting":2,"num_store_stats":3}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738492,"num_pgs":33,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30620,"kb_used_data":3788,"kb_used_omap":269,"kb_used_meta":26546,"kb_avail":94341220,"statfs":{"total":96636764160,"available":96605409280,"internally_reserved":0,"allocated":3878912,"data_stored":1954068,"data_compressed":15522,"data_compressed_allocated":290816,"data_compressed_original":562243,"omap_allocated":276118,"internal_metadata":27183466},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738492,"num_pgs":35,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30628,"kb_used_data":3796,"kb_used_omap":242,"kb_used_meta":26573,"kb_avail":94341212,"statfs":{
"total":96636764160,"available":96605401088,"internally_reserved":0,"allocated":3887104,"data_stored":1954087,"data_compressed":15550,"data_compressed_allocated":290816,"data_compressed_original":558147,"omap_allocated":248041,"internal_metadata":27211543},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772540,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30396,"kb_used_data":3564,"kb_used_omap":270,"kb_used_meta":26545,"kb_avail":94341444,"statfs":{"total":96636764160,"available":96605638656,"internally_reserved":0,"allocated":3649536,"data_stored":1494807,"data_compressed":10746,"data_compressed_allocated":57344,"data_compressed_original":102963,"omap_allocated":276512,"internal_metadata":27183072},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":2560,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3130,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1971,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2061,"internal_metadata":0}],"pg_stats":[{"pgid":"1.0","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.3","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.0","state":"active+clean","up":[2,1],"acting":[2,1],"up_primary":2,"acting_primary":2},{"pgid":"2.1","state":"active+clean","up":[2,1],"acting":[2,1],"up_primary":2,"acting_primary":2},{"pgid":"2.2","state":"active+clean","up":[0,1],"acting":[0,1],"up_primary":0,"acting_primary":0},{"pgid":"2.4","state":"active+clean","up":[1,0],"acting":[1,0],"up_primary":1,"acting_primary":1},{"pgid":"2.5","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.6","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"acting_primary":1},{"pgid":"2.7","state":"active+clean","up":[1,2],"acting":[1,2],"up_primary":1,"act
ing_primary":1}]}{"pg_ready":true,"pg_map":{"version":458,"stamp":"2026-03-31T20:31:13.658101+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":71,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1377878,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46,"ondisk_log_size":46,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":82,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":91644,"kb_used_data":11148,"kb_used_omap":781,"kb_used_meta":79666,"kb_avail":283023876,"statfs":{"total":289910292480,"available":289816449024,"internally_reserved":0,"allocated":11415552,"data_stored":5402962,"data_compressed":41818,"data_compressed_allocated":638976,"data_compressed_original":1223353,"omap_allocated":800671,"internal_metadata":81578081},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":3,"apply_latency_ms":3,"commit_latency_ns":3000000,"apply_latency_ns":3000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"5.878849"},"pg_stats":[{"pgid":"1.0","version":"172'41","reported_seq":989,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825149+0000","last_change":"2026-03-31T20:29:15.918391+0000","las
t_active":"2026-03-31T20:31:08.825149+0000","last_peered":"2026-03-31T20:31:08.825149+0000","last_clean":"2026-03-31T20:31:08.825149+0000","last_became_active":"2026-03-31T20:29:15.913130+0000","last_became_peered":"2026-03-31T20:29:15.913130+0000","last_unstale":"2026-03-31T20:31:08.825149+0000","last_undegraded":"2026-03-31T20:31:08.825149+0000","last_fullsized":"2026-03-31T20:31:08.825149+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'41","last_scrub_stamp":"2026-03-31T20:26:36.459878+0000","last_deep_scrub":"172'41","last_deep_scrub_stamp":"2026-03-31T20:26:29.486746+0000","last_clean_scrub_stamp":"2026-03-31T20:26:36.459878+0000","objects_scrubbed":2,"log_size":41,"log_dups_size":0,"ondisk_log_size":41,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-02T08:18:10.345705+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":6,"num_bytes_recovered":1377840,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.3","version":"172'2","reported_seq":870,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825177+0000","last_change":"2026-03-31T20:26:37.430938+0000","last_active":"2026-03-31T20:31:08.825177+0000","last_peered":"2026-03-31T20:31:08.825177+0000","last_clean":"2026-03-31T20:31:08.825177+0000","last_became_active":"2026-03-31T20:21:33.038753+0000","last_became_peered":"2026-03-31T20:21:33.038753+0000","last_unstale":"2026-03-31T20:31:08.825177+0000","last_undegraded":"2026-03-31T20:31:08.825177+0000","last_fullsized":"2026-03-31T20:31:08.825177+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'2","last_scrub_stamp":"2026-03-31T20:26:37.430905+0000","last_deep_scrub":"172'2","last_deep_scrub_stamp":"2026-03-31T20:26:31.466479+0000","last_clean_scrub_stamp":"2026-03-31T20:26:37.430905+0000","objects_scrubbed":1,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub 
scheduled @ 2026-04-02T04:16:40.214034+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.0","version":"0'0","reported_seq":860,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825836+0000","last_change":"2026-03-31T20:26:33.059521+0000","last_active":"2026-03-31T20:31:08.825836+0000","last_peered":"2026-03-31T20:31:08.825836+0000","last_clean":"2026-03-31T20:31:08.825836+0000","last_became_active":"2026-03-31T20:21:33.036655+0000","last_became_peered":"2026-03-31T20:21:33.036655+0000","last_unstale":"2026-03-31T20:31:08.825836+0000","last_undegraded":"2026-03-31T20:31:08.825836+0000","last_fullsized":"2026-03-31T20:31:08.825836+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:33.059473+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:31.028583+0000","last_clean_scrub_stamp":"2026-03-31T20:26:33.059473+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T05:48:20.649520+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.1","version":"0'0","reported_seq":860,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.826068+0000","last_change":"2026-03-31T20:26:34.047630+0000","last_active":"2026-03-31T20:31:08.826068+0000","last_peered":"2026-03-31T20:31:08.826068+0000","last_clean":"2026-03-31T20:31:08.826068+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:31:08.826068+0000","last_undegraded":"2026-03-31T20:31:08.826068+0000","last_fullsized":"2026-03-31T20:31:08.826068+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:34.047600+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.053167+0000","last_clean_scrub_stamp":"2026-03-31T20:26:34.047600+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T07:14:32.884696+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.2","version":"172'3","reported_seq":785,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825965+0000","last_change":"2026-03-31T20:29:18.927326+0000","last_active":"2026-03-31T20:31:08.825965+0000","last_peered":"2026-03-31T20:31:08.825965+0000","last_clean":"2026-03-31T20:31:08.825965+0000","last_became_active":"2026-03-31T20:29:18.927209+0000","last_became_peered":"2026-03-31T20:29:18.927209+0000","last_unstale":"2026-03-31T20:31:08.825965+0000","last_undegraded":"2026-03-31T20:31:08.825965+0000","last_fullsized":"2026-03-31T20:31:08.825965+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'3","last_scrub_stamp":"2026-03-31T20:26:31.889790+0000","last_deep_scrub":"172'3","last_deep_scrub_stamp":"2026-03-31T20:26:30.871302+0000","last_clean_scrub_stamp":"2026-03-31T20:26:31.889790+0000","objects_scrubbed":1,"log_size":3,"log_dups_size":0,"ondisk_log_size":3,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T04:32:06.896389+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.4","version":"0'0","reported_seq":891,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825189+0000","last_change":"2026-03-31T20:29:18.927905+0000","last_active":"2026-03-31T20:31:08.825189+0000","last_peered":"2026-03-31T20:31:08.825189+0000","last_clean":"2026-03-31T20:31:08.825189+0000","last_became_active":"2026-03-31T20:29:18.927814+0000","last_became_peered":"2026-03-31T20:29:18.927814+0000","last_unstale":"2026-03-31T20:31:08.825189+0000","last_undegraded":"2026-03-31T20:31:08.825189+0000","last_fullsized":"2026-03-31T20:31:08.825189+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:39.414769+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:33.464380+0000","last_clean_scrub_stamp":"2026-03-31T20:26:39.414769+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:40:05.887464+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,0],"acting":[1,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.5","version":"0'0","reported_seq":892,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.829888+0000","last_change":"2026-03-31T20:29:15.912780+0000","last_active":"2026-03-31T20:31:08.829888+0000","last_peered":"2026-03-31T20:31:08.829888+0000","last_clean":"2026-03-31T20:31:08.829888+0000","last_became_active":"2026-03-31T20:29:15.912702+0000","last_became_peered":"2026-03-31T20:29:15.912702+0000","last_unstale":"2026-03-31T20:31:08.829888+0000","last_undegraded":"2026-03-31T20:31:08.829888+0000","last_fullsized":"2026-03-31T20:31:08.829888+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:38.458279+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.494594+0000","last_clean_scrub_stamp":"2026-03-31T20:26:38.458279+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T22:25:49.071901+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.6","version":"0'0","reported_seq":892,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825239+0000","last_change":"2026-03-31T20:29:15.912481+0000","last_active":"2026-03-31T20:31:08.825239+0000","last_peered":"2026-03-31T20:31:08.825239+0000","last_clean":"2026-03-31T20:31:08.825239+0000","last_became_active":"2026-03-31T20:29:15.912185+0000","last_became_peered":"2026-03-31T20:29:15.912185+0000","last_unstale":"2026-03-31T20:31:08.825239+0000","last_undegraded":"2026-03-31T20:31:08.825239+0000","last_fullsized":"2026-03-31T20:31:08.825239+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:41.404628+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:35.469461+0000","last_clean_scrub_stamp":"2026-03-31T20:26:41.404628+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T03:20:25.566652+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]},{"pgid":"2.7","version":"0'0","reported_seq":892,"reported_epoch":425,"state":"active+clean","last_fresh":"2026-03-31T20:31:08.825262+0000","last_change":"2026-03-31T20:29:15.912306+0000","last_active":"2026-03-31T20:31:08.825262+0000","last_peered":"2026-03-31T20:31:08.825262+0000","last_clean":"2026-03-31T20:31:08.825262+0000","last_became_active":"2026-03-31T20:29:15.912193+0000","last_became_peered":"2026-03-31T20:29:15.912193+0000","last_unstale":"2026-03-31T20:31:08.825262+0000","last_undegraded":"2026-03-31T20:31:08.825262+0000","last_fullsized":"2026-03-31T20:31:08.825262+0000","mapping_epoch":327,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":328,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:40.426768+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:34.453503+0000","last_clean_scrub_stamp":"2026-03-31T20:26:40.426768+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T02:54:10.033462+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[1,2],"acting":[1,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":1,"acting_primary":1,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":7162,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":6,"num_bytes_recovered":1377840,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":471040,"data_stored":918560,"data_compressed":9580,"data_compressed_allocated":466944,"data_compressed_original":914464,"omap_allocated":2560,"internal_metadata":0},"log_size":41,"ond
isk_log_size":41,"up":2,"acting":2,"num_store_stats":3}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738492,"num_pgs":33,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30620,"kb_used_data":3788,"kb_used_omap":269,"kb_used_meta":26546,"kb_avail":94341220,"statfs":{"total":96636764160,"available":96605409280,"internally_reserved":0,"allocated":3878912,"data_stored":1954068,"data_compressed":15522,"data_compressed_allocated":290816,"data_compressed_original":562243,"omap_allocated":276118,"internal_metadata":27183466},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738492,"num_pgs":35,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30628,"kb_used_data":3796,"kb_used_omap":242,"kb_used_meta":26573,"kb_avail":94341212,"statfs":{"total":96636764160,"available":96605401088,"internally_reserved":0,"allocated":3887104,"data_stored":1954087,"data_compressed":15550,"data_compressed_allocated":290816,"data_compressed_original":558147,"omap_allocated":248041,"internal_metadata":27211543},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772540,"num_pgs":14,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":30396,"kb_used_data":3564,"kb_used_omap":270,"kb_used_meta":26545,"kb_avail":94341444,"statfs":{"total":96636764160,"available":96605638656,"internally_reserved":0,"allocated":3649536,"data_stored":1494807,"data_compressed":10746,"data_compressed_allocated":57344,"data_compressed_original":102963,"omap_allocated":276512,"internal_metadata":27183072},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":2560,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3130,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":1971,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available"
:0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2061,"internal_metadata":0}]}}{"pg_ready":true,"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":7162,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":6,"num_bytes_recovered":1377840,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":471040,"data_stored":918560,"data_compressed":9580,"data_compressed_allocated":466944,"data_compressed_original":914464,"omap_allocated":2560,"internal_metadata":0},"log_size":41,"ondisk_log_size":41,"up":2,"acting":2,"num_store_stats":3}]}PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:1.0 2 0 0 0 459280 0 0 41 0 active+clean 3s 172'41 427:1001 [1,2]p1 [1,2]p1 2026-03-31T20:31:12.319938+0000 2026-03-31T20:31:12.319938+0000 0 periodic scrub scheduled @ 2026-04-02T07:44:17.231283+0000 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:2.0 0 0 0 0 0 0 0 0 0 active+clean 4m 0'0 427:864 [2,1]p2 [2,1]p2 2026-03-31T20:26:33.059473+0000 2026-03-31T20:26:31.028583+0000 0 periodic scrub scheduled @ 2026-04-02T05:48:20.649520+0000 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:2.1 0 0 0 0 0 0 0 0 0 active+clean 4m 0'0 427:864 [2,1]p2 [2,1]p2 2026-03-31T20:26:34.047600+0000 
2026-03-31T20:26:32.053167+0000 0 periodic scrub scheduled @ 2026-04-02T07:14:32.884696+0000 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 116s 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:2.3 1 0 0 0 0 0 0 2 0 active+clean 4m 172'2 427:874 [1,2]p1 [1,2]p1 2026-03-31T20:26:37.430905+0000 2026-03-31T20:26:31.466479+0000 0 periodic scrub scheduled @ 2026-04-02T04:16:40.214034+0000 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 active+clean 116s 0'0 427:895 [1,0]p1 [1,0]p1 2026-03-31T20:26:39.414769+0000 2026-03-31T20:26:33.464380+0000 0 periodic scrub scheduled @ 2026-04-02T01:40:05.887464+0000 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:2.5 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:38.458279+0000 2026-03-31T20:26:32.494594+0000 0 periodic scrub scheduled @ 2026-04-01T22:25:49.071901+0000 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:2.6 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:41.404628+0000 2026-03-31T20:26:35.469461+0000 0 periodic scrub scheduled @ 2026-04-02T03:20:25.566652+0000 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:2.7 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:40.426768+0000 2026-03-31T20:26:34.453503+0000 0 periodic scrub scheduled @ 2026-04-02T02:54:10.033462+0000 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:15.922 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-31T20:31:15.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2163: test_mon_pg: ceph pg ls 1 2026-03-31T20:31:16.134 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:16.134 INFO:tasks.workunit.client.0.vm03.stdout:1.0 2 0 0 0 459280 0 0 41 0 active+clean 3s 172'41 427:1001 [1,2]p1 [1,2]p1 2026-03-31T20:31:12.319938+0000 2026-03-31T20:31:12.319938+0000 0 periodic scrub scheduled @ 2026-04-02T07:44:17.231283+0000 2026-03-31T20:31:16.134 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:16.134 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
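The statfs blocks in the dump above show the job's lz4/aggressive compression doing real work: pool 1 stores 918560 bytes of data in only 471040 allocated bytes, with 914464 bytes of compressible input (data_compressed_original) landing in 466944 allocated bytes (data_compressed_allocated), roughly a 2:1 saving. A minimal jq sketch for pulling the per-OSD figures out of a capture of this dump; the dump.json filename is hypothetical, and the recursive descent (..) is used so the sketch does not have to assume the exact nesting of the pg dump JSON:

  jq '[.. | objects | select(has("osd") and has("statfs"))
      | {osd, stored: .statfs.data_stored, allocated: .statfs.allocated,
         compressed_original: .statfs.data_compressed_original,
         compressed_allocated: .statfs.data_compressed_allocated}]' dump.json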
2026-03-31T20:31:16.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2164: test_mon_pg: ceph pg ls stale
2026-03-31T20:31:16.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2165: test_mon_pg: expect_false ceph pg ls scrubq
2026-03-31T20:31:16.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:31:16.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph pg ls scrubq
2026-03-31T20:31:16.512 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:31:16.507+0000 7f026e475640 -1 mgr.server reply reply (22) Invalid argument 'scrubq' is not a valid pg state, available choices: stale+creating+active+activating+clean+recovery_wait+recovery_toofull+recovering+forced_recovery+down+recovery_unfound+backfill_unfound+undersized+degraded+remapped+premerge+scrubbing+deep+inconsistent+peering+repair+backfill_wait+backfilling+forced_backfill+backfill_toofull+incomplete+peered+snaptrim+snaptrim_wait+snaptrim_error
2026-03-31T20:31:16.512 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: 'scrubq' is not a valid pg state, available choices: stale+creating+active+activating+clean+recovery_wait+recovery_toofull+recovering+forced_recovery+down+recovery_unfound+backfill_unfound+undersized+degraded+remapped+premerge+scrubbing+deep+inconsistent+peering+repair+backfill_wait+backfilling+forced_backfill+backfill_toofull+incomplete+peered+snaptrim+snaptrim_wait+snaptrim_error
2026-03-31T20:31:16.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:31:16.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2166: test_mon_pg: ceph pg ls active stale repair recovering
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:1.0 2 0 0 0 459280 0 0 41 0 active+clean 4s 172'41 427:1001 [1,2]p1 [1,2]p1 2026-03-31T20:31:12.319938+0000 2026-03-31T20:31:12.319938+0000 0 periodic scrub scheduled @ 2026-04-02T07:44:17.231283+0000
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:2.0 0 0 0 0 0 0 0 0 0 active+clean 4m 0'0 427:864 [2,1]p2 [2,1]p2 2026-03-31T20:26:33.059473+0000 2026-03-31T20:26:31.028583+0000 0 periodic scrub scheduled @ 2026-04-02T05:48:20.649520+0000
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:2.1 0 0 0 0 0 0 0 0 0 active+clean 4m 0'0 427:864 [2,1]p2 [2,1]p2 2026-03-31T20:26:34.047600+0000 2026-03-31T20:26:32.053167+0000 0 periodic scrub scheduled @ 2026-04-02T07:14:32.884696+0000
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 117s 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:2.3 1 0 0 0 0 0 0 2 0 active+clean 4m 172'2 427:874 [1,2]p1 [1,2]p1 2026-03-31T20:26:37.430905+0000 2026-03-31T20:26:31.466479+0000 0 periodic scrub scheduled @ 2026-04-02T04:16:40.214034+0000
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 active+clean 117s 0'0 427:895 [1,0]p1 [1,0]p1 2026-03-31T20:26:39.414769+0000 2026-03-31T20:26:33.464380+0000 0 periodic scrub scheduled @ 2026-04-02T01:40:05.887464+0000
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:2.5 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:38.458279+0000 2026-03-31T20:26:32.494594+0000 0 periodic scrub scheduled @ 2026-04-01T22:25:49.071901+0000
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:2.6 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:41.404628+0000 2026-03-31T20:26:35.469461+0000 0 periodic scrub scheduled @ 2026-04-02T03:20:25.566652+0000
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:2.7 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:40.426768+0000 2026-03-31T20:26:34.453503+0000 0 periodic scrub scheduled @ 2026-04-02T02:54:10.033462+0000
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:31:16.719 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-31T20:31:16.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2167: test_mon_pg: ceph pg ls 1 active
2026-03-31T20:31:16.937 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING
2026-03-31T20:31:16.937 INFO:tasks.workunit.client.0.vm03.stdout:1.0 2 0 0 0 459280 0 0 41 0 active+clean 4s 172'41 427:1001 [1,2]p1 [1,2]p1 2026-03-31T20:31:12.319938+0000 2026-03-31T20:31:12.319938+0000 0 periodic scrub scheduled @ 2026-04-02T07:44:17.231283+0000
2026-03-31T20:31:16.937 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:31:16.937 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-31T20:31:16.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2168: test_mon_pg: ceph pg ls 1 active stale
2026-03-31T20:31:17.155 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING
2026-03-31T20:31:17.155 INFO:tasks.workunit.client.0.vm03.stdout:1.0 2 0 0 0 459280 0 0 41 0 active+clean 4s 172'41 427:1001 [1,2]p1 [1,2]p1 2026-03-31T20:31:12.319938+0000 2026-03-31T20:31:12.319938+0000 0 periodic scrub scheduled @ 2026-04-02T07:44:17.231283+0000
2026-03-31T20:31:17.155 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:31:17.155 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
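The trace above exercises the ceph pg ls filter grammar: an optional pool id followed by zero or more pg state names, where a pg is listed if it is in the pool (when given) and matches at least one of the named states; anything that is not a valid state name is rejected by the mgr with EINVAL, which is exactly the failure expect_false is asserting for 'scrubq'. The forms used in this run, side by side:

  ceph pg ls                                  # every pg
  ceph pg ls 1                                # only pool 1 (its single pg, 1.0)
  ceph pg ls stale                            # state filter; nothing is stale here
  ceph pg ls active stale repair recovering   # several states, matched as any-of
  ceph pg ls 1 active stale                   # pool and states combined
  ceph pg ls scrubq                           # EINVAL: not a valid pg state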
2026-03-31T20:31:17.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2169: test_mon_pg: ceph pg ls-by-primary osd.0 2026-03-31T20:31:17.372 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:17.372 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 118s 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 2026-03-31T20:31:17.372 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:17.372 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-31T20:31:17.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2170: test_mon_pg: ceph pg ls-by-primary osd.0 1 2026-03-31T20:31:17.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2171: test_mon_pg: ceph pg ls-by-primary osd.0 active 2026-03-31T20:31:17.804 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:17.804 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 118s 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 2026-03-31T20:31:17.804 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:17.804 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-31T20:31:17.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2172: test_mon_pg: ceph pg ls-by-primary osd.0 active stale 2026-03-31T20:31:18.027 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:18.027 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 119s 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 2026-03-31T20:31:18.027 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:18.027 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-31T20:31:18.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2173: test_mon_pg: ceph pg ls-by-primary osd.0 1 active stale 2026-03-31T20:31:18.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2174: test_mon_pg: ceph pg ls-by-osd osd.0 2026-03-31T20:31:18.459 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:18.459 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 119s 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 2026-03-31T20:31:18.459 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 active+clean 119s 0'0 427:895 [1,0]p1 [1,0]p1 2026-03-31T20:26:39.414769+0000 2026-03-31T20:26:33.464380+0000 0 periodic scrub scheduled @ 2026-04-02T01:40:05.887464+0000 2026-03-31T20:31:18.459 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:18.459 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-31T20:31:18.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2175: test_mon_pg: ceph pg ls-by-osd osd.0 1 2026-03-31T20:31:18.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2176: test_mon_pg: ceph pg ls-by-osd osd.0 active 2026-03-31T20:31:18.907 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:18.907 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 119s 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 2026-03-31T20:31:18.907 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 active+clean 119s 0'0 427:895 [1,0]p1 [1,0]p1 2026-03-31T20:26:39.414769+0000 2026-03-31T20:26:33.464380+0000 0 periodic scrub scheduled @ 2026-04-02T01:40:05.887464+0000 2026-03-31T20:31:18.907 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:18.907 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
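The distinction being tested here: ls-by-primary matches only pgs whose acting primary is the given OSD, while ls-by-osd matches any pg that has the OSD in its up/acting set, and ls-by-pool (next in the trace) keys on a pool name instead of an OSD. This run shows the difference directly:

  ceph pg ls-by-primary osd.0   # only 2.2: [0,1]p0, osd.0 is the primary
  ceph pg ls-by-osd osd.0       # 2.2 and 2.4: in 2.4 ([1,0]p1) osd.0 is a replica
  ceph pg ls-by-pool rbd        # all eight pgs of pool 2, the rbd pool

All variants accept the same optional pool id and state filters as plain ceph pg ls, which is why ls-by-primary osd.0 1 returns nothing: pool 1's only pg, 1.0, is served by [1,2]p1.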
2026-03-31T20:31:18.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2177: test_mon_pg: ceph pg ls-by-osd osd.0 active stale 2026-03-31T20:31:19.129 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:19.130 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 2m 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 2026-03-31T20:31:19.130 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:895 [1,0]p1 [1,0]p1 2026-03-31T20:26:39.414769+0000 2026-03-31T20:26:33.464380+0000 0 periodic scrub scheduled @ 2026-04-02T01:40:05.887464+0000 2026-03-31T20:31:19.130 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:19.130 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-31T20:31:19.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2178: test_mon_pg: ceph pg ls-by-osd osd.0 1 active stale 2026-03-31T20:31:19.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2179: test_mon_pg: ceph pg ls-by-pool rbd 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:2.0 0 0 0 0 0 0 0 0 0 active+clean 4m 0'0 427:864 [2,1]p2 [2,1]p2 2026-03-31T20:26:33.059473+0000 2026-03-31T20:26:31.028583+0000 0 periodic scrub scheduled @ 2026-04-02T05:48:20.649520+0000 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:2.1 0 0 0 0 0 0 0 0 0 active+clean 4m 0'0 427:864 [2,1]p2 [2,1]p2 2026-03-31T20:26:34.047600+0000 2026-03-31T20:26:32.053167+0000 0 periodic scrub scheduled @ 2026-04-02T07:14:32.884696+0000 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 2m 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:2.3 1 0 0 0 0 0 0 2 0 active+clean 4m 172'2 427:874 [1,2]p1 [1,2]p1 2026-03-31T20:26:37.430905+0000 2026-03-31T20:26:31.466479+0000 0 periodic scrub scheduled @ 2026-04-02T04:16:40.214034+0000 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:895 [1,0]p1 [1,0]p1 2026-03-31T20:26:39.414769+0000 2026-03-31T20:26:33.464380+0000 0 periodic scrub scheduled @ 2026-04-02T01:40:05.887464+0000 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:2.5 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:38.458279+0000 2026-03-31T20:26:32.494594+0000 0 periodic scrub scheduled @ 2026-04-01T22:25:49.071901+0000 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:2.6 0 0 0 0 0 0 0 0 0 active+clean 2m 
0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:41.404628+0000 2026-03-31T20:26:35.469461+0000 0 periodic scrub scheduled @ 2026-04-02T03:20:25.566652+0000 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:2.7 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:40.426768+0000 2026-03-31T20:26:34.453503+0000 0 periodic scrub scheduled @ 2026-04-02T02:54:10.033462+0000 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:19.569 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-31T20:31:19.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2180: test_mon_pg: ceph pg ls-by-pool rbd active stale 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:PG OBJECTS DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS STATE SINCE VERSION REPORTED UP ACTING SCRUB_STAMP DEEP_SCRUB_STAMP LAST_SCRUB_DURATION SCRUB_SCHEDULING 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:2.0 0 0 0 0 0 0 0 0 0 active+clean 4m 0'0 427:864 [2,1]p2 [2,1]p2 2026-03-31T20:26:33.059473+0000 2026-03-31T20:26:31.028583+0000 0 periodic scrub scheduled @ 2026-04-02T05:48:20.649520+0000 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:2.1 0 0 0 0 0 0 0 0 0 active+clean 4m 0'0 427:864 [2,1]p2 [2,1]p2 2026-03-31T20:26:34.047600+0000 2026-03-31T20:26:32.053167+0000 0 periodic scrub scheduled @ 2026-04-02T07:14:32.884696+0000 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:2.2 1 0 0 0 19 0 0 3 0 active+clean 2m 172'3 427:789 [0,1]p0 [0,1]p0 2026-03-31T20:26:31.889790+0000 2026-03-31T20:26:30.871302+0000 0 periodic scrub scheduled @ 2026-04-02T04:32:06.896389+0000 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:2.3 1 0 0 0 0 0 0 2 0 active+clean 4m 172'2 427:874 [1,2]p1 [1,2]p1 2026-03-31T20:26:37.430905+0000 2026-03-31T20:26:31.466479+0000 0 periodic scrub scheduled @ 2026-04-02T04:16:40.214034+0000 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:2.4 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:895 [1,0]p1 [1,0]p1 2026-03-31T20:26:39.414769+0000 2026-03-31T20:26:33.464380+0000 0 periodic scrub scheduled @ 2026-04-02T01:40:05.887464+0000 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:2.5 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:38.458279+0000 2026-03-31T20:26:32.494594+0000 0 periodic scrub scheduled @ 2026-04-01T22:25:49.071901+0000 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:2.6 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:41.404628+0000 2026-03-31T20:26:35.469461+0000 0 periodic scrub scheduled @ 2026-04-02T03:20:25.566652+0000 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:2.7 0 0 0 0 0 0 0 0 0 active+clean 2m 0'0 427:896 [1,2]p1 [1,2]p1 2026-03-31T20:26:40.426768+0000 2026-03-31T20:26:34.453503+0000 0 periodic scrub scheduled @ 2026-04-02T02:54:10.033462+0000 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:31:19.784 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. 
See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-31T20:31:19.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2183: test_mon_pg: ceph pg getmap -o /tmp/cephtool.sYl/map.26274
2026-03-31T20:31:19.997 INFO:tasks.workunit.client.0.vm03.stderr:got pgmap version 461
2026-03-31T20:31:20.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2184: test_mon_pg: '[' -s /tmp/cephtool.sYl/map.26274 ']'
2026-03-31T20:31:20.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2185: test_mon_pg: ceph pg map 1.0
2026-03-31T20:31:20.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2185: test_mon_pg: grep acting
2026-03-31T20:31:20.222 INFO:tasks.workunit.client.0.vm03.stdout:osdmap e427 pg 1.0 (1.0) -> up [1,2] acting [1,2]
2026-03-31T20:31:20.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2186: test_mon_pg: ceph pg repair 1.0
2026-03-31T20:31:20.426 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to repair
2026-03-31T20:31:20.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2187: test_mon_pg: ceph pg scrub 1.0
2026-03-31T20:31:20.639 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to scrub
2026-03-31T20:31:20.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2189: test_mon_pg: ceph osd set-full-ratio .962
2026-03-31T20:31:21.898 INFO:tasks.workunit.client.0.vm03.stderr:osd set-full-ratio 0.962
2026-03-31T20:31:21.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2190: test_mon_pg: ceph osd dump
2026-03-31T20:31:21.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2190: test_mon_pg: grep '^full_ratio 0.962'
2026-03-31T20:31:22.204 INFO:tasks.workunit.client.0.vm03.stdout:full_ratio 0.962
2026-03-31T20:31:22.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2191: test_mon_pg: ceph osd set-backfillfull-ratio .912
2026-03-31T20:31:23.910 INFO:tasks.workunit.client.0.vm03.stderr:osd set-backfillfull-ratio 0.912
2026-03-31T20:31:23.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2192: test_mon_pg: ceph osd dump
2026-03-31T20:31:23.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2192: test_mon_pg: grep '^backfillfull_ratio 0.912'
2026-03-31T20:31:24.164 INFO:tasks.workunit.client.0.vm03.stdout:backfillfull_ratio 0.912
2026-03-31T20:31:24.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2193: test_mon_pg: ceph osd set-nearfull-ratio .892
2026-03-31T20:31:25.932 INFO:tasks.workunit.client.0.vm03.stderr:osd set-nearfull-ratio 0.892
2026-03-31T20:31:25.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2194: test_mon_pg: ceph osd dump
2026-03-31T20:31:25.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2194: test_mon_pg: grep '^nearfull_ratio 0.892'
2026-03-31T20:31:26.179 INFO:tasks.workunit.client.0.vm03.stdout:nearfull_ratio 0.892
2026-03-31T20:31:26.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2197: test_mon_pg: ceph osd set-nearfull-ratio .913
2026-03-31T20:31:26.931 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:31:26.927+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check failed: full ratio(s) out of order (OSD_OUT_OF_ORDER_FULL)
2026-03-31T20:31:27.948 INFO:tasks.workunit.client.0.vm03.stderr:osd set-nearfull-ratio 0.913
2026-03-31T20:31:27.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2198: test_mon_pg: ceph health -f json
2026-03-31T20:31:27.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2198: test_mon_pg: grep OSD_OUT_OF_ORDER_FULL
2026-03-31T20:31:28.218 INFO:tasks.workunit.client.0.vm03.stdout:{"status":"HEALTH_ERR","checks":{"OSD_OUT_OF_ORDER_FULL":{"severity":"HEALTH_ERR","summary":{"message":"full ratio(s) out of order","count":0},"muted":false}},"mutes":[]}
2026-03-31T20:31:28.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2199: test_mon_pg: ceph health detail
2026-03-31T20:31:28.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2199: test_mon_pg: grep OSD_OUT_OF_ORDER_FULL
2026-03-31T20:31:28.511 INFO:tasks.workunit.client.0.vm03.stdout:[ERR] OSD_OUT_OF_ORDER_FULL: full ratio(s) out of order
2026-03-31T20:31:28.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2200: test_mon_pg: ceph osd set-nearfull-ratio .892
2026-03-31T20:31:29.960 INFO:tasks.workunit.client.0.vm03.stderr:osd set-nearfull-ratio 0.892
2026-03-31T20:31:29.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2201: test_mon_pg: ceph osd set-backfillfull-ratio .963
2026-03-31T20:31:30.965 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:31:30.963+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check failed: full ratio(s) out of order (OSD_OUT_OF_ORDER_FULL)
2026-03-31T20:31:31.980 INFO:tasks.workunit.client.0.vm03.stderr:osd set-backfillfull-ratio 0.963
2026-03-31T20:31:31.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2202: test_mon_pg: ceph health -f json
2026-03-31T20:31:31.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2202: test_mon_pg: grep OSD_OUT_OF_ORDER_FULL
2026-03-31T20:31:32.236 INFO:tasks.workunit.client.0.vm03.stdout:{"status":"HEALTH_ERR","checks":{"OSD_OUT_OF_ORDER_FULL":{"severity":"HEALTH_ERR","summary":{"message":"full ratio(s) out of order","count":0},"muted":false}},"mutes":[]}
2026-03-31T20:31:32.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2203: test_mon_pg: ceph health detail
2026-03-31T20:31:32.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2203: test_mon_pg: grep OSD_OUT_OF_ORDER_FULL
2026-03-31T20:31:32.517 INFO:tasks.workunit.client.0.vm03.stdout:[ERR] OSD_OUT_OF_ORDER_FULL: full ratio(s) out of order
2026-03-31T20:31:32.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2204: test_mon_pg: ceph osd set-backfillfull-ratio .912
2026-03-31T20:31:33.991 INFO:tasks.workunit.client.0.vm03.stderr:osd set-backfillfull-ratio 0.912
2026-03-31T20:31:34.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2207: test_mon_pg: sudo ceph tell osd.0 injectfull nearfull
2026-03-31T20:31:34.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2208: test_mon_pg: wait_for_health OSD_NEARFULL
2026-03-31T20:31:34.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1834: wait_for_health: local grepstr=OSD_NEARFULL
2026-03-31T20:31:34.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: get_timeout_delays 300 .1
2026-03-31T20:31:34.081 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace
2026-03-31T20:31:34.081 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true
2026-03-31T20:31:34.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true
2026-03-31T20:31:34.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true
2026-03-31T20:31:34.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace
2026-03-31T20:31:34.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '4.5')
2026-03-31T20:31:34.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: local -a delays
2026-03-31T20:31:34.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1836: wait_for_health: local -i loop=0
2026-03-31T20:31:34.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail
2026-03-31T20:31:34.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_NEARFULL
2026-03-31T20:31:34.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 0 >= 27 ))
2026-03-31T20:31:34.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.1
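Behind the set-*-ratio checks above is the invariant nearfull_ratio <= backfillfull_ratio <= full_ratio: pushing nearfull to 0.913 over backfillfull 0.912, and later backfillfull to 0.963 over full 0.962, each raises OSD_OUT_OF_ORDER_FULL (HEALTH_ERR) until the offending ratio is put back. A hedged one-liner that asserts the ordering from the same osd dump lines the test greps:

  ceph osd dump | awk '/^full_ratio /{f=$2} /^backfillfull_ratio /{b=$2} /^nearfull_ratio /{n=$2}
      END { if (n <= b && b <= f) print "ratios ordered"
            else print "out of order: expect OSD_OUT_OF_ORDER_FULL" }'

The injectfull tell that follows (sudo ceph tell osd.0 injectfull nearfull) is a test hook that makes the OSD report the given fullness level without writing any data; injectfull none clears it again, which is why the later wait_for_health_ok converges.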
2026-03-31T20:31:34.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:34.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:34.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_NEARFULL 2026-03-31T20:31:34.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 1 >= 27 )) 2026-03-31T20:31:34.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.2 2026-03-31T20:31:35.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:35.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:35.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_NEARFULL 2026-03-31T20:31:35.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 2 >= 27 )) 2026-03-31T20:31:35.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.4 2026-03-31T20:31:35.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:35.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:35.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_NEARFULL 2026-03-31T20:31:35.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 3 >= 27 )) 2026-03-31T20:31:35.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.8 2026-03-31T20:31:36.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:36.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:36.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_NEARFULL 2026-03-31T20:31:37.070 
INFO:tasks.workunit.client.0.vm03.stdout:[WRN] OSD_NEARFULL: 1 nearfull osd(s) 2026-03-31T20:31:37.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2209: test_mon_pg: ceph health detail 2026-03-31T20:31:37.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2209: test_mon_pg: grep 'osd.0 is near full' 2026-03-31T20:31:37.330 INFO:tasks.workunit.client.0.vm03.stdout: osd.0 is near full 2026-03-31T20:31:37.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2210: test_mon_pg: sudo ceph tell osd.0 injectfull none 2026-03-31T20:31:37.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2211: test_mon_pg: wait_for_health_ok 2026-03-31T20:31:37.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1855: wait_for_health_ok: wait_for_health HEALTH_OK 2026-03-31T20:31:37.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1834: wait_for_health: local grepstr=HEALTH_OK 2026-03-31T20:31:37.410 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: get_timeout_delays 300 .1 2026-03-31T20:31:37.411 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:31:37.411 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:31:37.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:31:37.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:31:37.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:31:37.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '4.5') 2026-03-31T20:31:37.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: local -a delays 2026-03-31T20:31:37.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1836: wait_for_health: local -i loop=0 2026-03-31T20:31:37.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:37.594 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:37.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 0 >= 27 )) 2026-03-31T20:31:37.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.1 2026-03-31T20:31:37.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:37.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:37.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:38.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 1 >= 27 )) 2026-03-31T20:31:38.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.2 2026-03-31T20:31:38.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 2 >= 27 )) 2026-03-31T20:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.4 2026-03-31T20:31:39.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:39.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:39.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:39.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 3 >= 27 )) 2026-03-31T20:31:39.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.8 2026-03-31T20:31:40.155 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:40.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 4 >= 27 )) 2026-03-31T20:31:40.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 1.6 2026-03-31T20:31:42.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:42.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:42.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:42.303 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_OK 2026-03-31T20:31:42.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2213: test_mon_pg: sudo ceph tell osd.1 injectfull backfillfull 2026-03-31T20:31:42.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2214: test_mon_pg: wait_for_health OSD_BACKFILLFULL 2026-03-31T20:31:42.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1834: wait_for_health: local grepstr=OSD_BACKFILLFULL 2026-03-31T20:31:42.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: get_timeout_delays 300 .1 2026-03-31T20:31:42.387 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:31:42.387 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:31:42.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:31:42.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:31:42.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:31:42.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: 
wait_for_health: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '4.5') 2026-03-31T20:31:42.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: local -a delays 2026-03-31T20:31:42.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1836: wait_for_health: local -i loop=0 2026-03-31T20:31:42.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:42.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_BACKFILLFULL 2026-03-31T20:31:42.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 0 >= 27 )) 2026-03-31T20:31:42.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.1 2026-03-31T20:31:42.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:42.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:42.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_BACKFILLFULL 2026-03-31T20:31:43.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 1 >= 27 )) 2026-03-31T20:31:43.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.2 2026-03-31T20:31:43.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:43.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:43.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_BACKFILLFULL 2026-03-31T20:31:43.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 2 >= 27 )) 2026-03-31T20:31:43.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.4 2026-03-31T20:31:44.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:44.060 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:44.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_BACKFILLFULL 2026-03-31T20:31:44.328 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] OSD_BACKFILLFULL: 1 backfillfull osd(s) 2026-03-31T20:31:44.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2215: test_mon_pg: ceph health detail 2026-03-31T20:31:44.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2215: test_mon_pg: grep 'osd.1 is backfill full' 2026-03-31T20:31:44.596 INFO:tasks.workunit.client.0.vm03.stdout: osd.1 is backfill full 2026-03-31T20:31:44.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2216: test_mon_pg: sudo ceph tell osd.1 injectfull none 2026-03-31T20:31:44.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2217: test_mon_pg: wait_for_health_ok 2026-03-31T20:31:44.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1855: wait_for_health_ok: wait_for_health HEALTH_OK 2026-03-31T20:31:44.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1834: wait_for_health: local grepstr=HEALTH_OK 2026-03-31T20:31:44.674 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: get_timeout_delays 300 .1 2026-03-31T20:31:44.675 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:31:44.675 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:31:44.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:31:44.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:31:44.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:31:44.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '4.5') 2026-03-31T20:31:44.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: local -a delays 2026-03-31T20:31:44.829 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1836: wait_for_health: local -i loop=0 2026-03-31T20:31:44.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:44.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:45.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 0 >= 27 )) 2026-03-31T20:31:45.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.1 2026-03-31T20:31:45.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:45.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:45.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:45.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 1 >= 27 )) 2026-03-31T20:31:45.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.2 2026-03-31T20:31:45.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:45.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:45.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:45.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 2 >= 27 )) 2026-03-31T20:31:45.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.4 2026-03-31T20:31:46.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:46.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:46.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:46.605 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 3 >= 27 )) 2026-03-31T20:31:46.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.8 2026-03-31T20:31:47.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:47.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:47.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:47.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 4 >= 27 )) 2026-03-31T20:31:47.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 1.6 2026-03-31T20:31:49.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:49.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:49.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:49.538 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_OK 2026-03-31T20:31:49.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2219: test_mon_pg: sudo ceph tell osd.2 injectfull failsafe 2026-03-31T20:31:49.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2221: test_mon_pg: wait_for_health OSD_FULL 2026-03-31T20:31:49.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1834: wait_for_health: local grepstr=OSD_FULL 2026-03-31T20:31:49.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: get_timeout_delays 300 .1 2026-03-31T20:31:49.619 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:31:49.619 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:31:49.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:31:49.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 
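
The two helpers traced above drive every health wait in this workunit. get_timeout_delays 300 .1 expands to the 27-entry schedule seen in each delays=(...) line: a delay that starts at 0.1s and doubles up to a 15s cap, with a final 4.5s remainder so the entries sum to exactly the 300s timeout; wait_for_health then polls "ceph health detail" through that schedule. A minimal sketch of both, reconstructed from the trace rather than copied from qa/standalone/ceph-helpers.sh (it assumes GNU bc is available, and bc prints ".1" where the real helper prints "0.1"):

    get_timeout_delays_sketch() {
        # Emit delays starting at $2 seconds, doubling up to a 15s cap,
        # summing to exactly the $1-second timeout (last entry = remainder).
        local timeout=$1 delay=$2 cap=15 total=0
        local -a out=()
        while (( $(bc -l <<< "$total + $delay < $timeout") )); do
            out+=("$delay")
            total=$(bc -l <<< "$total + $delay")
            delay=$(bc -l <<< "d = $delay * 2; if (d > $cap) d = $cap; d")
        done
        out+=("$(bc -l <<< "$timeout - $total")")  # the trailing 4.5 above
        echo "${out[@]}"
    }

    wait_for_health_sketch() {
        # Poll "ceph health detail" for the given string, sleeping through
        # the backoff schedule; give up once all delays are consumed.
        local grepstr=$1
        local -a delays
        delays=($(get_timeout_delays_sketch 300 .1))
        local -i loop=0
        while ! ceph health detail | grep "$grepstr"; do
            (( loop >= ${#delays[*]} )) && return 1  # the (( N >= 27 )) check
            sleep "${delays[$loop]}"
            loop+=1  # arithmetic increment; valid due to the -i attribute
        done
    }

Per the trace at ceph-helpers.sh:1855, wait_for_health_ok is simply wait_for_health HEALTH_OK, which is why each recovery below is followed by the same HEALTH_OK polling loop.
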
2026-03-31T20:31:49.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:31:49.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '4.5') 2026-03-31T20:31:49.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: local -a delays 2026-03-31T20:31:49.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1836: wait_for_health: local -i loop=0 2026-03-31T20:31:49.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:49.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:31:50.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 0 >= 27 )) 2026-03-31T20:31:50.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.1 2026-03-31T20:31:50.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:50.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:50.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:31:50.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 1 >= 27 )) 2026-03-31T20:31:50.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.2 2026-03-31T20:31:50.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:50.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:50.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:31:50.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 2 >= 27 )) 2026-03-31T20:31:50.871 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.4 2026-03-31T20:31:51.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:51.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:51.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:31:51.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 3 >= 27 )) 2026-03-31T20:31:51.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.8 2026-03-31T20:31:52.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:52.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:52.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:31:52.504 INFO:tasks.ceph.osd.2.vm03.stderr:2026-03-31T20:31:52.499+0000 7fdfe4390640 -1 log_channel(cluster) log [ERR] : full status failsafe engaged, dropping updates, now 0% full 2026-03-31T20:31:52.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 4 >= 27 )) 2026-03-31T20:31:52.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 1.6 2026-03-31T20:31:53.172 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:31:53.171+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 full osd(s) (OSD_FULL) 2026-03-31T20:31:54.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:54.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:54.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:31:54.460 INFO:tasks.workunit.client.0.vm03.stdout:[ERR] OSD_FULL: 1 full osd(s) 2026-03-31T20:31:54.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2222: test_mon_pg: ceph health detail 2026-03-31T20:31:54.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2222: test_mon_pg: grep 'osd.2 is full' 2026-03-31T20:31:54.721 
INFO:tasks.workunit.client.0.vm03.stdout: osd.2 is full 2026-03-31T20:31:54.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2223: test_mon_pg: sudo ceph tell osd.2 injectfull none 2026-03-31T20:31:54.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2224: test_mon_pg: wait_for_health_ok 2026-03-31T20:31:54.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1855: wait_for_health_ok: wait_for_health HEALTH_OK 2026-03-31T20:31:54.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1834: wait_for_health: local grepstr=HEALTH_OK 2026-03-31T20:31:54.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: get_timeout_delays 300 .1 2026-03-31T20:31:54.800 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:31:54.800 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:31:54.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:31:54.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:31:54.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:31:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '4.5') 2026-03-31T20:31:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: local -a delays 2026-03-31T20:31:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1836: wait_for_health: local -i loop=0 2026-03-31T20:31:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:55.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 0 >= 27 )) 2026-03-31T20:31:55.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.1 2026-03-31T20:31:55.328 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:55.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:55.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:55.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 1 >= 27 )) 2026-03-31T20:31:55.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.2 2026-03-31T20:31:55.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:55.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:55.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:56.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 2 >= 27 )) 2026-03-31T20:31:56.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.4 2026-03-31T20:31:56.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:56.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:56.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:56.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 3 >= 27 )) 2026-03-31T20:31:56.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.8 2026-03-31T20:31:57.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:57.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:57.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:57.705 INFO:tasks.ceph.osd.2.vm03.stderr:2026-03-31T20:31:57.703+0000 
7fdfe4390640 -1 log_channel(cluster) log [ERR] : full status failsafe disengaged, no longer dropping updates, now 0% full 2026-03-31T20:31:57.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 4 >= 27 )) 2026-03-31T20:31:57.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 1.6 2026-03-31T20:31:59.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:31:59.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:59.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:31:59.676 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_OK 2026-03-31T20:31:59.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2226: test_mon_pg: sudo ceph tell osd.0 injectfull full 2026-03-31T20:31:59.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2227: test_mon_pg: wait_for_health OSD_FULL 2026-03-31T20:31:59.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1834: wait_for_health: local grepstr=OSD_FULL 2026-03-31T20:31:59.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: get_timeout_delays 300 .1 2026-03-31T20:31:59.759 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:31:59.759 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:31:59.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:31:59.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:31:59.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:31:59.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '4.5') 2026-03-31T20:31:59.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: local -a delays 2026-03-31T20:31:59.913 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1836: wait_for_health: local -i loop=0 2026-03-31T20:31:59.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:31:59.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:32:00.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 0 >= 27 )) 2026-03-31T20:32:00.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.1 2026-03-31T20:32:00.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:00.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:00.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:32:00.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 1 >= 27 )) 2026-03-31T20:32:00.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.2 2026-03-31T20:32:00.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:00.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:00.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:32:01.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 2 >= 27 )) 2026-03-31T20:32:01.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.4 2026-03-31T20:32:01.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:01.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:01.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:32:01.677 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 3 >= 27 )) 2026-03-31T20:32:01.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.8 2026-03-31T20:32:02.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:02.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:02.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:32:02.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 4 >= 27 )) 2026-03-31T20:32:02.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 1.6 2026-03-31T20:32:03.222 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:32:03.219+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 full osd(s) (OSD_FULL) 2026-03-31T20:32:04.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:04.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:04.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep OSD_FULL 2026-03-31T20:32:04.619 INFO:tasks.workunit.client.0.vm03.stdout:[ERR] OSD_FULL: 1 full osd(s) 2026-03-31T20:32:04.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2228: test_mon_pg: ceph health detail 2026-03-31T20:32:04.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2228: test_mon_pg: grep 'osd.0 is full' 2026-03-31T20:32:04.890 INFO:tasks.workunit.client.0.vm03.stdout: osd.0 is full 2026-03-31T20:32:04.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2229: test_mon_pg: sudo ceph tell osd.0 injectfull none 2026-03-31T20:32:04.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2230: test_mon_pg: wait_for_health_ok 2026-03-31T20:32:04.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1855: wait_for_health_ok: wait_for_health HEALTH_OK 2026-03-31T20:32:04.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1834: wait_for_health: local grepstr=HEALTH_OK 2026-03-31T20:32:04.969 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: 
wait_for_health: get_timeout_delays 300 .1 2026-03-31T20:32:04.969 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:32:04.969 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:32:04.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:32:04.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:32:04.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:32:05.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '15' '4.5') 2026-03-31T20:32:05.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1835: wait_for_health: local -a delays 2026-03-31T20:32:05.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1836: wait_for_health: local -i loop=0 2026-03-31T20:32:05.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:05.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:32:05.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 0 >= 27 )) 2026-03-31T20:32:05.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.1 2026-03-31T20:32:05.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:05.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:05.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:32:05.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 1 >= 27 )) 2026-03-31T20:32:05.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.2 2026-03-31T20:32:05.999 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:05.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:05.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:32:06.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 2 >= 27 )) 2026-03-31T20:32:06.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.4 2026-03-31T20:32:06.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:06.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:06.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:32:06.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 3 >= 27 )) 2026-03-31T20:32:06.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 0.8 2026-03-31T20:32:07.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:07.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:07.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:32:08.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 4 >= 27 )) 2026-03-31T20:32:08.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 1.6 2026-03-31T20:32:09.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:09.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:09.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:32:09.900 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1839: wait_for_health: (( 5 >= 27 )) 2026-03-31T20:32:09.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1843: wait_for_health: sleep 3.2 2026-03-31T20:32:13.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1844: wait_for_health: loop+=1 2026-03-31T20:32:13.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: ceph health detail 2026-03-31T20:32:13.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1838: wait_for_health: grep HEALTH_OK 2026-03-31T20:32:13.371 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_OK 2026-03-31T20:32:13.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2232: test_mon_pg: ceph pg stat 2026-03-31T20:32:13.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2232: test_mon_pg: grep pgs: 2026-03-31T20:32:13.577 INFO:tasks.workunit.client.0.vm03.stdout:9 pgs: 9 active+clean; 449 KiB data, 90 MiB used, 270 GiB / 270 GiB avail 2026-03-31T20:32:13.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2233: test_mon_pg: ceph pg 1.0 query 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trimq": "[]", 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trimq_len": 0, 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: "state": "active+clean", 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 449, 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: "up": [ 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: "acting": [ 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: "acting_recovery_backfill": [ 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: "1", 2026-03-31T20:32:13.653 INFO:tasks.workunit.client.0.vm03.stdout: "2" 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "info": { 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "pgid": "1.0", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "shard": "255", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_update": "172'41", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_complete": "172'41", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "log_tail": "0'0", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_user_version": 39, 2026-03-31T20:32:13.654 
INFO:tasks.workunit.client.0.vm03.stdout: "last_backfill": "MAX", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "partial_writes_last_complete": [], 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "partial_writes_last_complete_epoch": "0", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "purged_snaps": [], 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "history": { 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_created": 9, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_pool_created": 9, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_started": 328, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_interval_started": 327, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_clean": 328, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_interval_clean": 327, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_split": 0, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_marked_full": 448, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "same_up_since": 327, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "same_interval_since": 327, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "same_primary_since": 9, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub": "172'41", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub": "172'41", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub_stamp": "2026-03-31T20:31:21.304517+0000", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "prior_readable_until_ub": 0 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "stats": { 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "version": "172'41", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "reported_seq": 1071, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "reported_epoch": 449, 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "state": "active+clean", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_fresh": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_change": "2026-03-31T20:31:22.320553+0000", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_active": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_peered": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.654 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_became_active": "2026-03-31T20:29:15.913130+0000", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_became_peered": "2026-03-31T20:29:15.913130+0000", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_unstale": "2026-03-31T20:32:10.735600+0000", 
2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_undegraded": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_fullsized": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "mapping_epoch": 327, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "log_start": "0'0", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "ondisk_log_start": "0'0", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "created": 9, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_clean": 328, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "parent": "0.0", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "parent_split_bits": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub": "172'41", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub": "172'41", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub_stamp": "2026-03-31T20:31:21.304517+0000", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "objects_scrubbed": 2, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "log_size": 41, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "log_dups_size": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "ondisk_log_size": 41, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "stats_invalid": false, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "dirty_stats_invalid": false, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "omap_stats_invalid": false, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "hitset_stats_invalid": false, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "hitset_bytes_stats_invalid": false, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "pin_stats_invalid": false, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "manifest_stats_invalid": false, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "snaptrimq_len": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_duration": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_schedule": "periodic scrub scheduled @ 2026-04-01T21:14:15.870060+0000", 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_duration": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "objects_trimmed": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "snaptrim_duration": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "stat_sum": { 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes": 459280, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects": 2, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_clones": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_copies": 4, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing_on_primary": 0, 
2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_degraded": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_misplaced": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_unfound": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_dirty": 2, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_whiteouts": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_read": 106, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_read_kb": 213, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_write": 69, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_write_kb": 584, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrub_errors": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_shallow_scrub_errors": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_deep_scrub_errors": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_recovered": 6, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_recovered": 1377840, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_keys_recovered": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_omap": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_hit_set_archive": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_hit_set_archive": 0, 2026-03-31T20:32:13.655 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_kb": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_kb": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_promote": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_high": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_low": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_some": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_full": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_pinned": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_legacy_snapsets": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_large_omap_objects": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_manifest": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_bytes": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_keys": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_repaired": 0 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "up": [ 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.656 
INFO:tasks.workunit.client.0.vm03.stdout: "acting": [ 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "avail_no_missing": [], 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "object_location_counts": [], 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "blocked_by": [], 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "up_primary": 1, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "acting_primary": 1, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "purged_snaps": [] 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "empty": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "dne": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "incomplete": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_started": 328, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "hit_set_history": { 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "current_last_update": "0'0", 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "history": [] 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "peer_info": [ 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "peer": "2", 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "pgid": "1.0", 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "shard": "255", 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_update": "172'41", 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_complete": "0'0", 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "log_tail": "0'0", 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_user_version": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_backfill": "MAX", 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "partial_writes_last_complete": [], 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "partial_writes_last_complete_epoch": "0", 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "purged_snaps": [], 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "history": { 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_created": 9, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_pool_created": 9, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_started": 328, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_interval_started": 327, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_clean": 328, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_interval_clean": 327, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_split": 0, 2026-03-31T20:32:13.656 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_marked_full": 0, 
2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "same_up_since": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "same_interval_since": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "same_primary_since": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub": "172'41", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub": "172'41", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub_stamp": "2026-03-31T20:31:21.304517+0000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "prior_readable_until_ub": 0 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "stats": { 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "version": "0'0", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "reported_seq": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "reported_epoch": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "state": "unknown", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_fresh": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_change": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_active": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_peered": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_became_active": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_became_peered": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_unstale": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_undegraded": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_fullsized": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "mapping_epoch": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "log_start": "0'0", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "ondisk_log_start": "0'0", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "created": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_clean": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "parent": "0.0", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "parent_split_bits": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub": "0'0", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_stamp": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub": "0'0", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub_stamp": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean_scrub_stamp": "0.000000", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "objects_scrubbed": 0, 
2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "log_size": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "log_dups_size": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "ondisk_log_size": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "stats_invalid": false, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "dirty_stats_invalid": false, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "omap_stats_invalid": false, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "hitset_stats_invalid": false, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "hitset_bytes_stats_invalid": false, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "pin_stats_invalid": false, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "manifest_stats_invalid": false, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "snaptrimq_len": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_duration": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_schedule": "--", 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_duration": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "objects_trimmed": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "snaptrim_duration": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "stat_sum": { 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_clones": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_copies": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing_on_primary": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing": 0, 2026-03-31T20:32:13.657 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_degraded": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_misplaced": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_unfound": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_dirty": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_whiteouts": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_read": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_read_kb": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_write": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_write_kb": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrub_errors": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_shallow_scrub_errors": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_deep_scrub_errors": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_recovered": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_recovered": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_keys_recovered": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_omap": 0, 2026-03-31T20:32:13.658 
INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_hit_set_archive": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_hit_set_archive": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_kb": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_kb": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_promote": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_high": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_low": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_some": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_full": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_pinned": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_legacy_snapsets": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_large_omap_objects": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_manifest": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_bytes": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_keys": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_repaired": 0 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "up": [], 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "acting": [], 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "avail_no_missing": [], 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "object_location_counts": [], 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "blocked_by": [], 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "up_primary": -1, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "acting_primary": -1, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "purged_snaps": [] 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "empty": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "dne": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "incomplete": 0, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_started": 328, 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "hit_set_history": { 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "current_last_update": "0'0", 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "history": [] 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "recovery_state": [ 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "name": "Started/Primary/Active", 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "enter_time": 
"2026-03-31T20:29:15.910414+0000", 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "might_have_unfound": [ 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "osd": "0", 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: "status": "not queried" 2026-03-31T20:32:13.658 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "osd": "2", 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "status": "already probed" 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "recovery_progress": { 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "backfill_targets": [], 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "waiting_on_backfill": [], 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "last_backfill_started": "MIN", 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "backfill_info": { 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "begin": "MIN", 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "end": "MIN", 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "objects": [] 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "peer_backfill_info": [], 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "backfills_in_flight": [], 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "recovering": [], 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "pg_backend": { 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "pull_from_peer": [], 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "pushing": [] 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "name": "Started", 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "enter_time": "2026-03-31T20:29:14.896600+0000" 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "active": false, 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false, 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false, 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false, 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-04-01T21:14:15.870060+0000", 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-04-01T21:14:15.870 (2026-04-01T21:14:15.870)" 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 
2026-03-31T20:32:13.659 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:32:13.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2234: test_mon_pg: ceph tell 1.0 query 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trimq": "[]", 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trimq_len": 0, 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "state": "active+clean", 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 449, 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "up": [ 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "acting": [ 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "acting_recovery_backfill": [ 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "1", 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "2" 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "info": { 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "pgid": "1.0", 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "shard": "255", 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "last_update": "172'41", 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "last_complete": "172'41", 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "log_tail": "0'0", 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "last_user_version": 39, 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "last_backfill": "MAX", 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "partial_writes_last_complete": [], 2026-03-31T20:32:13.743 INFO:tasks.workunit.client.0.vm03.stdout: "partial_writes_last_complete_epoch": "0", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "purged_snaps": [], 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "history": { 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_created": 9, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_pool_created": 9, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_started": 328, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_interval_started": 327, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_clean": 328, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_interval_clean": 327, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_split": 0, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_marked_full": 448, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "same_up_since": 327, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "same_interval_since": 327, 2026-03-31T20:32:13.744 
INFO:tasks.workunit.client.0.vm03.stdout: "same_primary_since": 9, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub": "172'41", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub": "172'41", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub_stamp": "2026-03-31T20:31:21.304517+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "prior_readable_until_ub": 0 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "stats": { 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "version": "172'41", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "reported_seq": 1071, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "reported_epoch": 449, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "state": "active+clean", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_fresh": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_change": "2026-03-31T20:31:22.320553+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_active": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_peered": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_became_active": "2026-03-31T20:29:15.913130+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_became_peered": "2026-03-31T20:29:15.913130+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_unstale": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_undegraded": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_fullsized": "2026-03-31T20:32:10.735600+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "mapping_epoch": 327, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "log_start": "0'0", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "ondisk_log_start": "0'0", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "created": 9, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_clean": 328, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "parent": "0.0", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "parent_split_bits": 0, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub": "172'41", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub": "172'41", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub_stamp": "2026-03-31T20:31:21.304517+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean_scrub_stamp": 
"2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "objects_scrubbed": 2, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "log_size": 41, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "log_dups_size": 0, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "ondisk_log_size": 41, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "stats_invalid": false, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "dirty_stats_invalid": false, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "omap_stats_invalid": false, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "hitset_stats_invalid": false, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "hitset_bytes_stats_invalid": false, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "pin_stats_invalid": false, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "manifest_stats_invalid": false, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "snaptrimq_len": 0, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_duration": 0, 2026-03-31T20:32:13.744 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_schedule": "periodic scrub scheduled @ 2026-04-01T21:14:15.870060+0000", 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_duration": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "objects_trimmed": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "snaptrim_duration": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "stat_sum": { 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes": 459280, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects": 2, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_clones": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_copies": 4, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing_on_primary": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_degraded": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_misplaced": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_unfound": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_dirty": 2, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_whiteouts": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_read": 106, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_read_kb": 213, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_write": 69, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_write_kb": 584, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrub_errors": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_shallow_scrub_errors": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_deep_scrub_errors": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_recovered": 6, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_recovered": 1377840, 
2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_keys_recovered": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_omap": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_hit_set_archive": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_hit_set_archive": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_kb": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_kb": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_promote": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_high": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_low": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_some": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_full": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_pinned": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_legacy_snapsets": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_large_omap_objects": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_manifest": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_bytes": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_keys": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_repaired": 0 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "up": [ 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "acting": [ 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "avail_no_missing": [], 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "object_location_counts": [], 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "blocked_by": [], 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "up_primary": 1, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "acting_primary": 1, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "purged_snaps": [] 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "empty": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "dne": 0, 2026-03-31T20:32:13.745 INFO:tasks.workunit.client.0.vm03.stdout: "incomplete": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_started": 328, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "hit_set_history": { 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "current_last_update": "0'0", 
2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "history": [] 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "peer_info": [ 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "peer": "2", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "pgid": "1.0", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "shard": "255", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_update": "172'41", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_complete": "0'0", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "log_tail": "0'0", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_user_version": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_backfill": "MAX", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "partial_writes_last_complete": [], 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "partial_writes_last_complete_epoch": "0", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "purged_snaps": [], 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "history": { 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_created": 9, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_pool_created": 9, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_started": 328, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_interval_started": 327, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_clean": 328, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_interval_clean": 327, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_split": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_marked_full": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "same_up_since": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "same_interval_since": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "same_primary_since": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub": "172'41", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub": "172'41", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub_stamp": "2026-03-31T20:31:21.304517+0000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean_scrub_stamp": "2026-03-31T20:31:22.320510+0000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "prior_readable_until_ub": 0 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "stats": { 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "version": "0'0", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "reported_seq": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "reported_epoch": 0, 2026-03-31T20:32:13.746 
INFO:tasks.workunit.client.0.vm03.stdout: "state": "unknown", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_fresh": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_change": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_active": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_peered": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_became_active": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_became_peered": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_unstale": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_undegraded": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_fullsized": "0.000000", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "mapping_epoch": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "log_start": "0'0", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "ondisk_log_start": "0'0", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "created": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_clean": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "parent": "0.0", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "parent_split_bits": 0, 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub": "0'0", 2026-03-31T20:32:13.746 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_stamp": "0.000000", 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub": "0'0", 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "last_deep_scrub_stamp": "0.000000", 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "last_clean_scrub_stamp": "0.000000", 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "objects_scrubbed": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "log_size": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "log_dups_size": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "ondisk_log_size": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "stats_invalid": false, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "dirty_stats_invalid": false, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "omap_stats_invalid": false, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "hitset_stats_invalid": false, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "hitset_bytes_stats_invalid": false, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "pin_stats_invalid": false, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "manifest_stats_invalid": false, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "snaptrimq_len": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "last_scrub_duration": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_schedule": "--", 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_duration": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: 
"objects_trimmed": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "snaptrim_duration": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "stat_sum": { 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_clones": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_copies": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing_on_primary": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_degraded": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_misplaced": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_unfound": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_dirty": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_whiteouts": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_read": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_read_kb": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_write": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_write_kb": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrub_errors": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_shallow_scrub_errors": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_deep_scrub_errors": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_recovered": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_recovered": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_keys_recovered": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_omap": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_hit_set_archive": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_hit_set_archive": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_kb": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_kb": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_promote": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_high": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_low": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_some": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_full": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_pinned": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_legacy_snapsets": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_large_omap_objects": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_manifest": 0, 2026-03-31T20:32:13.747 
INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_bytes": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_keys": 0, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_repaired": 0 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.747 INFO:tasks.workunit.client.0.vm03.stdout: "up": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "acting": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "avail_no_missing": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "object_location_counts": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "blocked_by": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "up_primary": -1, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "acting_primary": -1, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "purged_snaps": [] 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "empty": 0, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "dne": 0, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "incomplete": 0, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "last_epoch_started": 328, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "hit_set_history": { 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "current_last_update": "0'0", 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "history": [] 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "recovery_state": [ 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "name": "Started/Primary/Active", 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "enter_time": "2026-03-31T20:29:15.910414+0000", 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "might_have_unfound": [ 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "osd": "0", 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "status": "not queried" 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "osd": "2", 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "status": "already probed" 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "recovery_progress": { 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "backfill_targets": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "waiting_on_backfill": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "last_backfill_started": "MIN", 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "backfill_info": { 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "begin": "MIN", 
2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "end": "MIN", 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "objects": [] 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "peer_backfill_info": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "backfills_in_flight": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "recovering": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "pg_backend": { 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "pull_from_peer": [], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "pushing": [] 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "name": "Started", 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "enter_time": "2026-03-31T20:29:14.896600+0000" 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "active": false, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false, 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-04-01T21:14:15.870060+0000", 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-04-01T21:14:15.870 (2026-04-01T21:14:15.870)" 2026-03-31T20:32:13.748 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:32:13.749 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 2026-03-31T20:32:13.749 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:32:13.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2235: test_mon_pg: ceph mon dump -f json 2026-03-31T20:32:13.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2235: test_mon_pg: jq -r '.mons[0].name' 2026-03-31T20:32:14.013 INFO:tasks.workunit.client.0.vm03.stderr:dumped monmap epoch 6 2026-03-31T20:32:14.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2235: test_mon_pg: first=a 2026-03-31T20:32:14.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2236: test_mon_pg: ceph tell mon.a quorum enter 2026-03-31T20:32:14.097 INFO:tasks.workunit.client.0.vm03.stdout:started responding to quorum, initiated new election 2026-03-31T20:32:14.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2237: test_mon_pg: ceph quorum_status 2026-03-31T20:32:14.366 
INFO:tasks.workunit.client.0.vm03.stdout:{"election_epoch":34,"quorum":[0,1,2],"quorum_names":["a","b","c"],"quorum_leader_name":"a","quorum_age":0,"features":{"quorum_con":"4541880224203014143","quorum_mon":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef","squid","tentacle"]},"monmap":{"epoch":6,"fsid":"a4a0ca01-ae82-443e-a7c7-50605716689a","modified":"2026-03-31T20:25:41.765560Z","created":"2026-03-31T20:21:18.374590Z","min_mon_release":20,"min_mon_release_name":"tentacle","election_strategy":1,"disallowed_leaders":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef","squid","tentacle"],"optional":[]},"mons":[{"rank":0,"name":"a","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3300","nonce":0},{"type":"v1","addr":"192.168.123.103:6789","nonce":0}]},"addr":"192.168.123.103:6789/0","public_addr":"192.168.123.103:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"b","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3301","nonce":0},{"type":"v1","addr":"192.168.123.103:6790","nonce":0}]},"addr":"192.168.123.103:6790/0","public_addr":"192.168.123.103:6790/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":2,"name":"c","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:3302","nonce":0},{"type":"v1","addr":"192.168.123.103:6791","nonce":0}]},"addr":"192.168.123.103:6791/0","public_addr":"192.168.123.103:6791/0","priority":0,"weight":0,"crush_location":"{}"}]}} 2026-03-31T20:32:14.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2238: test_mon_pg: ceph report 2026-03-31T20:32:14.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2238: test_mon_pg: grep osd_stats 2026-03-31T20:32:14.637 INFO:tasks.workunit.client.0.vm03.stderr:report 839635622 2026-03-31T20:32:14.650 INFO:tasks.workunit.client.0.vm03.stdout: "osd_stats": [ 2026-03-31T20:32:14.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2239: test_mon_pg: ceph status 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: cluster: 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: id: a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: health: HEALTH_OK 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: services: 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: mon: 3 daemons, quorum a,b,c (age 0.808526s) [leader: a] 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: mgr: x(active, since 6m) 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: osd: 3 osds: 3 up (since 4m), 3 in (since 2m) 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: data: 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: pools: 2 pools, 9 pgs 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: objects: 4 objects, 449 KiB 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: usage: 90 MiB used, 270 GiB / 270 GiB avail 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: pgs: 9 
active+clean 2026-03-31T20:32:14.907 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:32:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2240: test_mon_pg: ceph -s 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: cluster: 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: id: a4a0ca01-ae82-443e-a7c7-50605716689a 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: health: HEALTH_OK 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: services: 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: mon: 3 daemons, quorum a,b,c (age 1.07321s) [leader: a] 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: mgr: x(active, since 6m) 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: osd: 3 osds: 3 up (since 4m), 3 in (since 2m) 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: data: 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: pools: 2 pools, 9 pgs 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: objects: 4 objects, 449 KiB 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: usage: 90 MiB used, 270 GiB / 270 GiB avail 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: pgs: 9 active+clean 2026-03-31T20:32:15.171 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:32:15.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2245: test_mon_pg: ceph tell osd.0 version 2026-03-31T20:32:15.258 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:32:15.258 INFO:tasks.workunit.client.0.vm03.stdout: "version": "20.2.0-721-g5bb32787", 2026-03-31T20:32:15.258 INFO:tasks.workunit.client.0.vm03.stdout: "release": "tentacle", 2026-03-31T20:32:15.258 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable" 2026-03-31T20:32:15.258 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:32:15.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2246: test_mon_pg: expect_false ceph tell osd.9999 version 2026-03-31T20:32:15.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:15.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell osd.9999 version 2026-03-31T20:32:15.324 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: problem getting command descriptions from osd.9999 2026-03-31T20:32:15.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:15.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2247: test_mon_pg: expect_false ceph tell osd.foo version 2026-03-31T20:32:15.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:15.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell osd.foo version 2026-03-31T20:32:15.383 INFO:tasks.workunit.client.0.vm03.stderr:error 
handling command target: osd id foo not integer 2026-03-31T20:32:15.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:15.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2251: test_mon_pg: ceph tell osd.0 dump_pg_recovery_stats 2026-03-31T20:32:15.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2251: test_mon_pg: grep Started 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started" 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stdout: "Started", 2026-03-31T20:32:15.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2253: test_mon_pg: ceph osd reweight 0 0.9 2026-03-31T20:32:17.188 INFO:tasks.workunit.client.0.vm03.stderr:reweighted osd.0 to 0.9 (e666) 2026-03-31T20:32:17.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2254: test_mon_pg: expect_false ceph osd reweight 0 -1 2026-03-31T20:32:17.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:17.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd reweight 0 -1 2026-03-31T20:32:17.365 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: -1.0 not in range [0.0, 1.0] 2026-03-31T20:32:17.365 INFO:tasks.workunit.client.0.vm03.stderr:osd reweight <id|osd.id> <weight:float[0.0-1.0]> : reweight osd to 0.0 < <weight> < 1.0 2026-03-31T20:32:17.365 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:32:17.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:17.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2255: test_mon_pg: ceph osd reweight osd.0 1
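The reweight calls traced above exercise the OSD override weight, a 0.0-1.0 multiplier layered on top of the CRUSH weight; the value echoed back in parentheses is that weight scaled to 16 bits and printed in hex (0.9 * 0x10000 = 0xe666, and 1.0 prints as 10000). A minimal standalone sketch of the same checks, assuming only a reachable cluster and an admin keyring; the commands mirror the test verbatim:

    ceph osd reweight 0 0.9      # accepted: osd.0 now takes ~90% of its CRUSH share
    ceph osd reweight 0 -1       # rejected: EINVAL, -1.0 not in range [0.0, 1.0]
    ceph osd reweight osd.0 1    # restore full weight; bare "0" and "osd.0" both parse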
2026-03-31T20:32:19.211 INFO:tasks.workunit.client.0.vm03.stderr:reweighted osd.0 to 1 (10000) 2026-03-31T20:32:19.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2257: test_mon_pg: ceph osd primary-affinity osd.0 .9 2026-03-31T20:32:21.227 INFO:tasks.workunit.client.0.vm03.stderr:set osd.0 primary-affinity to 0.9 (8589822) 2026-03-31T20:32:21.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2258: test_mon_pg: expect_false ceph osd primary-affinity osd.0 -2 2026-03-31T20:32:21.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:21.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd primary-affinity osd.0 -2 2026-03-31T20:32:21.393 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: -2.0 not in range [0.0, 1.0] 2026-03-31T20:32:21.393 INFO:tasks.workunit.client.0.vm03.stderr:osd primary-affinity <id|osd.id> <weight:float[0.0-1.0]> : adjust osd primary-affinity from 0.0 <= <weight> <= 1.0 2026-03-31T20:32:21.393 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:32:21.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:21.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2259: test_mon_pg: expect_false ceph osd primary-affinity osd.9999 .5 2026-03-31T20:32:21.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:21.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd primary-affinity osd.9999 .5 2026-03-31T20:32:21.550 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: osd.9999 does not exist 2026-03-31T20:32:21.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:21.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2260: test_mon_pg: ceph osd primary-affinity osd.0 1 2026-03-31T20:32:23.249 INFO:tasks.workunit.client.0.vm03.stderr:set osd.0 primary-affinity to 1 (8655362) 2026-03-31T20:32:23.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2262: test_mon_pg: ceph osd pool set rbd size 2 2026-03-31T20:32:25.261 INFO:tasks.workunit.client.0.vm03.stderr:set pool 2 size to 2 2026-03-31T20:32:25.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2263: test_mon_pg: ceph osd pg-temp 1.0 0 1 2026-03-31T20:32:28.272 INFO:tasks.workunit.client.0.vm03.stderr:set 1.0 pg_temp mapping to [0,1] 2026-03-31T20:32:28.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2264: test_mon_pg: ceph osd pg-temp 1.0 osd.1 osd.0 2026-03-31T20:32:30.834 INFO:tasks.workunit.client.0.vm03.stderr:set 1.0 pg_temp mapping to [1,0] 2026-03-31T20:32:30.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2265: test_mon_pg: expect_false ceph osd pg-temp 1.0 0 1 2
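The pg-temp steps above pin a temporary acting set for a single PG, a developers-only knob that the monitors validate against the pool: the expect_false case that runs next fails precisely because three OSDs were given for a size-2 pool. A minimal sketch of the lifecycle, assuming the same pg id the test uses:

    ceph osd pg-temp 1.0 0 1            # pin pg 1.0's acting set to [0,1]
    ceph osd pg-temp 1.0 osd.1 osd.0    # the osd.N spelling is accepted as well
    ceph osd pg-temp 1.0                # no OSDs listed: clears the pg_temp mapping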
2026-03-31T20:32:30.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:30.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pg-temp 1.0 0 1 2 2026-03-31T20:32:30.997 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: num of osds (3) > pool size (2) 2026-03-31T20:32:31.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:31.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2266: test_mon_pg: expect_false ceph osd pg-temp asdf qwer 2026-03-31T20:32:31.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:31.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pg-temp asdf qwer 2026-03-31T20:32:31.149 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: pgid has no . 2026-03-31T20:32:31.149 INFO:tasks.workunit.client.0.vm03.stderr:osd pg-temp <pgid> [<id|osd.id>...] : set pg_temp mapping <pgid>:[<id> [<id>...]] (developers only) 2026-03-31T20:32:31.149 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:32:31.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:31.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2267: test_mon_pg: expect_false ceph osd pg-temp 1.0 asdf 2026-03-31T20:32:31.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:31.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pg-temp 1.0 asdf 2026-03-31T20:32:31.300 INFO:tasks.workunit.client.0.vm03.stderr:asdf not valid: osd id asdf not integer 2026-03-31T20:32:31.300 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: unused arguments: ['asdf'] 2026-03-31T20:32:31.300 INFO:tasks.workunit.client.0.vm03.stderr:osd pg-temp <pgid> [<id|osd.id>...]
: set pg_temp mapping :[ [...]] (developers only) 2026-03-31T20:32:31.300 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:32:31.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:31.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2268: test_mon_pg: ceph osd pg-temp 1.0 2026-03-31T20:32:32.851 INFO:tasks.workunit.client.0.vm03.stderr:done cleaning up pg_temp of 1.0 2026-03-31T20:32:32.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2270: test_mon_pg: ceph pg repeer 1.0 2026-03-31T20:32:34.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2271: test_mon_pg: expect_false ceph pg repeer 0.0 2026-03-31T20:32:34.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:34.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph pg repeer 0.0 2026-03-31T20:32:34.991 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: pgid '0.0' does not exist 2026-03-31T20:32:34.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:34.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:32:35.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_osd_pool_set 2026-03-31T20:32:35.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2278: test_mon_osd_pool_set: TEST_POOL_GETSET=pool_getset 2026-03-31T20:32:35.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2279: test_mon_osd_pool_set: expect_false ceph osd pool create pool_getset 1 --target_size_ratio -0.3 2026-03-31T20:32:35.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:32:35.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool create pool_getset 1 --target_size_ratio -0.3 2026-03-31T20:32:35.341 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: -0.3 not in range [0.0] 2026-03-31T20:32:35.341 INFO:tasks.workunit.client.0.vm03.stderr:osd pool create [] [] [] [] [] [] [] [] [] [] [--bulk] [] [] [--yes-i-really-mean-it] : create pool 2026-03-31T20:32:35.341 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:32:35.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:32:35.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2280: test_mon_osd_pool_set: expect_true ceph osd pool create pool_getset 1 --target_size_ratio 1 2026-03-31T20:32:35.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:32:35.344 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: ceph osd pool create pool_getset 1 --target_size_ratio 1 2026-03-31T20:32:35.888 INFO:tasks.workunit.client.0.vm03.stderr:pool 'pool_getset' already exists 2026-03-31T20:32:35.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:32:35.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2281: test_mon_osd_pool_set: ceph osd pool application enable pool_getset rados 2026-03-31T20:32:37.841 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'pool_getset' 2026-03-31T20:32:37.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2282: test_mon_osd_pool_set: ceph osd pool set pool_getset pg_autoscale_mode off 2026-03-31T20:32:39.887 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 pg_autoscale_mode to off 2026-03-31T20:32:39.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2283: test_mon_osd_pool_set: wait_for_clean 2026-03-31T20:32:39.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1652: wait_for_clean: local cmd= 2026-03-31T20:32:39.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1653: wait_for_clean: local num_active_clean=-1 2026-03-31T20:32:39.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1654: wait_for_clean: local cur_active_clean 2026-03-31T20:32:39.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: get_timeout_delays 90 .1 2026-03-31T20:32:39.900 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:32:39.901 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:32:39.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:32:39.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:32:39.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:32:39.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-31T20:32:39.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: local -a delays 2026-03-31T20:32:39.970 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1656: wait_for_clean: local -i loop=0 2026-03-31T20:32:39.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1658: wait_for_clean: flush_pg_stats 2026-03-31T20:32:39.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: local timeout=300 2026-03-31T20:32:39.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls 2026-03-31T20:32:40.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0 2026-03-31T20:32:40.180 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:32:40.180 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:32:40.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs= 2026-03-31T20:32:40.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:32:40.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats 2026-03-31T20:32:40.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=949187772560 2026-03-31T20:32:40.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 949187772560 2026-03-31T20:32:40.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772560' 2026-03-31T20:32:40.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:32:40.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats 2026-03-31T20:32:40.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738512 2026-03-31T20:32:40.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738512 2026-03-31T20:32:40.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772560 1-34359738512' 2026-03-31T20:32:40.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:32:40.342 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats 2026-03-31T20:32:40.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738512 2026-03-31T20:32:40.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738512 2026-03-31T20:32:40.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772560 1-34359738512 2-34359738512' 2026-03-31T20:32:40.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:32:40.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-949187772560 2026-03-31T20:32:40.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:32:40.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0 2026-03-31T20:32:40.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-949187772560 2026-03-31T20:32:40.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:32:40.425 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 949187772560 2026-03-31T20:32:40.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=949187772560 2026-03-31T20:32:40.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 949187772560' 2026-03-31T20:32:40.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:32:40.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772558 -lt 949187772560 2026-03-31T20:32:40.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:32:41.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-31T20:32:41.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:32:41.834 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772558 -lt 949187772560 2026-03-31T20:32:41.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:32:42.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-31T20:32:42.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:32:43.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772560 -lt 949187772560 2026-03-31T20:32:43.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:32:43.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738512 2026-03-31T20:32:43.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:32:43.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1 2026-03-31T20:32:43.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738512 2026-03-31T20:32:43.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:32:43.040 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738512 2026-03-31T20:32:43.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738512 2026-03-31T20:32:43.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738512' 2026-03-31T20:32:43.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:32:43.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738512 -lt 34359738512 2026-03-31T20:32:43.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:32:43.240 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738512 2026-03-31T20:32:43.240 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:32:43.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2 2026-03-31T20:32:43.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738512 2026-03-31T20:32:43.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:32:43.242 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738512 2026-03-31T20:32:43.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738512 2026-03-31T20:32:43.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738512' 2026-03-31T20:32:43.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-31T20:32:43.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738512 -lt 34359738512 2026-03-31T20:32:43.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: get_num_pgs 2026-03-31T20:32:43.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:32:43.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:32:43.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: test 10 == 0 2026-03-31T20:32:43.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1663: wait_for_clean: true 2026-03-31T20:32:43.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: get_num_active_clean 2026-03-31T20:32:43.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1360: get_num_active_clean: local expression 2026-03-31T20:32:43.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1361: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-31T20:32:43.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1362: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-31T20:32:43.704 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1363: get_num_active_clean: ceph --format json pg dump pgs 2026-03-31T20:32:43.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1364: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-31T20:32:43.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: cur_active_clean=10 2026-03-31T20:32:43.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: get_num_pgs 2026-03-31T20:32:43.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:32:43.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:32:44.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: test 10 = 10 2026-03-31T20:32:44.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: break 2026-03-31T20:32:44.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1683: wait_for_clean: return 0 2026-03-31T20:32:44.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2284: test_mon_osd_pool_set: ceph osd pool get pool_getset all 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:size: 2 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:min_size: 1 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:pg_num: 1 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:pgp_num: 1 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:crush_rule: replicated_rule 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:hashpspool: true 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:nodelete: false 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:nopgchange: false 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:nosizechange: false 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:write_fadvise_dontneed: false 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:noscrub: false 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:nodeep-scrub: false 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:use_gmt_hitset: 1 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:fast_read: 0 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:pg_autoscale_mode: off 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:target_size_ratio: 1 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:eio: false 2026-03-31T20:32:44.359 INFO:tasks.workunit.client.0.vm03.stdout:bulk: false 2026-03-31T20:32:44.369 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2286: test_mon_osd_pool_set: for s in pg_num pgp_num size min_size crush_rule target_size_ratio 2026-03-31T20:32:44.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2287: test_mon_osd_pool_set: ceph osd pool get pool_getset pg_num 2026-03-31T20:32:44.558 INFO:tasks.workunit.client.0.vm03.stdout:pg_num: 1 2026-03-31T20:32:44.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2286: test_mon_osd_pool_set: for s in pg_num pgp_num size min_size crush_rule target_size_ratio 2026-03-31T20:32:44.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2287: test_mon_osd_pool_set: ceph osd pool get pool_getset pgp_num 2026-03-31T20:32:44.759 INFO:tasks.workunit.client.0.vm03.stdout:pgp_num: 1 2026-03-31T20:32:44.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2286: test_mon_osd_pool_set: for s in pg_num pgp_num size min_size crush_rule target_size_ratio 2026-03-31T20:32:44.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2287: test_mon_osd_pool_set: ceph osd pool get pool_getset size 2026-03-31T20:32:44.962 INFO:tasks.workunit.client.0.vm03.stdout:size: 2 2026-03-31T20:32:44.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2286: test_mon_osd_pool_set: for s in pg_num pgp_num size min_size crush_rule target_size_ratio 2026-03-31T20:32:44.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2287: test_mon_osd_pool_set: ceph osd pool get pool_getset min_size 2026-03-31T20:32:45.168 INFO:tasks.workunit.client.0.vm03.stdout:min_size: 1 2026-03-31T20:32:45.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2286: test_mon_osd_pool_set: for s in pg_num pgp_num size min_size crush_rule target_size_ratio 2026-03-31T20:32:45.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2287: test_mon_osd_pool_set: ceph osd pool get pool_getset crush_rule 2026-03-31T20:32:45.368 INFO:tasks.workunit.client.0.vm03.stdout:crush_rule: replicated_rule 2026-03-31T20:32:45.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2286: test_mon_osd_pool_set: for s in pg_num pgp_num size min_size crush_rule target_size_ratio 2026-03-31T20:32:45.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2287: test_mon_osd_pool_set: ceph osd pool get pool_getset target_size_ratio 2026-03-31T20:32:45.567 INFO:tasks.workunit.client.0.vm03.stdout:target_size_ratio: 1 2026-03-31T20:32:45.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2290: test_mon_osd_pool_set: ceph osd pool get pool_getset size 2026-03-31T20:32:45.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2290: test_mon_osd_pool_set: sed -e 's/size: //' 2026-03-31T20:32:45.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2290: 
test_mon_osd_pool_set: old_size=2 2026-03-31T20:32:45.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2291: test_mon_osd_pool_set: (( new_size = old_size + 1 )) 2026-03-31T20:32:45.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2292: test_mon_osd_pool_set: ceph osd pool set pool_getset size 3 --yes-i-really-mean-it 2026-03-31T20:32:47.055 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 size to 3 2026-03-31T20:32:47.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2293: test_mon_osd_pool_set: ceph osd pool get pool_getset size 2026-03-31T20:32:47.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2293: test_mon_osd_pool_set: grep 'size: 3' 2026-03-31T20:32:47.272 INFO:tasks.workunit.client.0.vm03.stdout:size: 3 2026-03-31T20:32:47.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2294: test_mon_osd_pool_set: ceph osd pool set pool_getset size 2 --yes-i-really-mean-it 2026-03-31T20:32:49.070 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 size to 2 2026-03-31T20:32:49.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2296: test_mon_osd_pool_set: ceph osd pool create pool_erasure 1 1 erasure 2026-03-31T20:32:50.136 INFO:tasks.workunit.client.0.vm03.stderr:pool 'pool_erasure' already exists 2026-03-31T20:32:50.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2297: test_mon_osd_pool_set: ceph osd pool application enable pool_erasure rados 2026-03-31T20:32:52.100 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'pool_erasure' 2026-03-31T20:32:52.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2298: test_mon_osd_pool_set: wait_for_clean 2026-03-31T20:32:52.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1652: wait_for_clean: local cmd= 2026-03-31T20:32:52.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1653: wait_for_clean: local num_active_clean=-1 2026-03-31T20:32:52.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1654: wait_for_clean: local cur_active_clean 2026-03-31T20:32:52.115 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: get_timeout_delays 90 .1 2026-03-31T20:32:52.115 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:32:52.115 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:32:52.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:32:52.116 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:32:52.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:32:52.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-31T20:32:52.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: local -a delays 2026-03-31T20:32:52.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1656: wait_for_clean: local -i loop=0 2026-03-31T20:32:52.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1658: wait_for_clean: flush_pg_stats 2026-03-31T20:32:52.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: local timeout=300 2026-03-31T20:32:52.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls 2026-03-31T20:32:52.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0 2026-03-31T20:32:52.397 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:32:52.397 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:32:52.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs= 2026-03-31T20:32:52.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:32:52.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats 2026-03-31T20:32:52.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=949187772564 2026-03-31T20:32:52.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 949187772564 2026-03-31T20:32:52.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772564' 2026-03-31T20:32:52.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:32:52.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats 
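The delays array printed by get_timeout_delays 90 .1 (at 20:32:52.196 just above, and at 20:32:39 earlier) is the retry schedule wait_for_clean sleeps through: the interval doubles from 0.1s, is capped at 15s, and the final entry is clipped so the whole schedule sums to the 90s budget. A rough awk restatement of that arithmetic (an illustration of the schedule, not the helper's actual code):

awk 'BEGIN { budget = 90; delay = 0.1
             while (budget > 1e-9) {                       # epsilon guards float residue
                 step = (delay < budget ? delay : budget)  # clip the last entry to the budget
                 printf "%g ", step
                 budget -= step
                 delay = (delay * 2 < 15 ? delay * 2 : 15) # double, capped at 15s
             }
             print "" }'
# prints: 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5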
2026-03-31T20:32:52.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738516 2026-03-31T20:32:52.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738516 2026-03-31T20:32:52.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772564 1-34359738516' 2026-03-31T20:32:52.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:32:52.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats 2026-03-31T20:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738516 2026-03-31T20:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738516 2026-03-31T20:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772564 1-34359738516 2-34359738516' 2026-03-31T20:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-949187772564 2026-03-31T20:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:32:52.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0 2026-03-31T20:32:52.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-949187772564 2026-03-31T20:32:52.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:32:52.645 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 949187772564 2026-03-31T20:32:52.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=949187772564 2026-03-31T20:32:52.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 949187772564' 2026-03-31T20:32:52.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 
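The flush_pg_stats loop being traced here is a flush/ack handshake: each `ceph tell osd.N flush_pg_stats` returns the sequence number of the stats it just flushed, and the helper then polls `ceph osd last-stat-seq N` until the mon reports having seen that sequence. The same logic condensed, without the helper's 300s timeout handling:

for osd in $(ceph osd ls); do
    seq=$(ceph tell osd.$osd flush_pg_stats)   # osd replies with the seq of the flushed stats
    echo "waiting osd.$osd seq $seq"
    until [ "$(ceph osd last-stat-seq $osd)" -ge "$seq" ]; do
        sleep 1                                # mon has not caught up to that seq yet
    done
done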
2026-03-31T20:32:52.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772562 -lt 949187772564 2026-03-31T20:32:52.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:32:53.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-31T20:32:53.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:32:54.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772562 -lt 949187772564 2026-03-31T20:32:54.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:32:55.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-31T20:32:55.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:32:55.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772564 -lt 949187772564 2026-03-31T20:32:55.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:32:55.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738516 2026-03-31T20:32:55.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:32:55.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1 2026-03-31T20:32:55.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738516 2026-03-31T20:32:55.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:32:55.259 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738516 2026-03-31T20:32:55.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738516 2026-03-31T20:32:55.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738516' 2026-03-31T20:32:55.259 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:32:55.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738516 -lt 34359738516 2026-03-31T20:32:55.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:32:55.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738516 2026-03-31T20:32:55.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:32:55.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2 2026-03-31T20:32:55.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738516 2026-03-31T20:32:55.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:32:55.464 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738516 2026-03-31T20:32:55.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738516 2026-03-31T20:32:55.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738516' 2026-03-31T20:32:55.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-31T20:32:55.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738516 -lt 34359738516 2026-03-31T20:32:55.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: get_num_pgs 2026-03-31T20:32:55.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:32:55.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:32:55.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: test 11 == 0 2026-03-31T20:32:55.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1663: wait_for_clean: true 2026-03-31T20:32:55.928 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: get_num_active_clean 2026-03-31T20:32:55.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1360: get_num_active_clean: local expression 2026-03-31T20:32:55.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1361: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-31T20:32:55.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1362: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-31T20:32:55.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1363: get_num_active_clean: ceph --format json pg dump pgs 2026-03-31T20:32:55.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1364: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-31T20:32:56.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: cur_active_clean=11 2026-03-31T20:32:56.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: get_num_pgs 2026-03-31T20:32:56.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:32:56.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:32:56.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: test 11 = 11 2026-03-31T20:32:56.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: break 2026-03-31T20:32:56.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1683: wait_for_clean: return 0 2026-03-31T20:32:56.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2299: test_mon_osd_pool_set: set +e 2026-03-31T20:32:56.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2300: test_mon_osd_pool_set: ceph osd pool set pool_erasure size 4444 2026-03-31T20:32:56.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2301: test_mon_osd_pool_set: check_response 'not change the size' 2026-03-31T20:32:56.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='not change the size' 2026-03-31T20:32:56.544 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:32:56.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:32:56.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:32:56.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'not change the size' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:32:56.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2302: test_mon_osd_pool_set: set -e 2026-03-31T20:32:56.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2303: test_mon_osd_pool_set: ceph osd pool get pool_erasure erasure_code_profile 2026-03-31T20:32:56.736 INFO:tasks.workunit.client.0.vm03.stdout:erasure_code_profile: default 2026-03-31T20:32:56.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2304: test_mon_osd_pool_set: ceph osd pool rm pool_erasure pool_erasure --yes-i-really-really-mean-it 2026-03-31T20:32:57.317 INFO:tasks.workunit.client.0.vm03.stderr:pool 'pool_erasure' does not exist 2026-03-31T20:32:57.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2306: test_mon_osd_pool_set: for flag in nodelete nopgchange nosizechange write_fadvise_dontneed noscrub nodeep-scrub bulk 2026-03-31T20:32:57.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2307: test_mon_osd_pool_set: ceph osd pool set pool_getset nodelete false 2026-03-31T20:32:59.279 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodelete to false 2026-03-31T20:32:59.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: ceph osd pool get pool_getset nodelete 2026-03-31T20:32:59.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: grep 'nodelete: false' 2026-03-31T20:32:59.493 INFO:tasks.workunit.client.0.vm03.stdout:nodelete: false 2026-03-31T20:32:59.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2309: test_mon_osd_pool_set: ceph osd pool set pool_getset nodelete true 2026-03-31T20:33:01.295 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodelete to true 2026-03-31T20:33:01.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: ceph osd pool get pool_getset nodelete 2026-03-31T20:33:01.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: grep 'nodelete: true' 2026-03-31T20:33:01.508 INFO:tasks.workunit.client.0.vm03.stdout:nodelete: true 2026-03-31T20:33:01.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2311: test_mon_osd_pool_set: ceph osd pool set pool_getset nodelete 1 2026-03-31T20:33:03.306 
INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodelete to 1 2026-03-31T20:33:03.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: ceph osd pool get pool_getset nodelete 2026-03-31T20:33:03.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: grep 'nodelete: true' 2026-03-31T20:33:03.523 INFO:tasks.workunit.client.0.vm03.stdout:nodelete: true 2026-03-31T20:33:03.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2313: test_mon_osd_pool_set: ceph osd pool set pool_getset nodelete 0 2026-03-31T20:33:05.324 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodelete to 0 2026-03-31T20:33:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: ceph osd pool get pool_getset nodelete 2026-03-31T20:33:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: grep 'nodelete: false' 2026-03-31T20:33:05.537 INFO:tasks.workunit.client.0.vm03.stdout:nodelete: false 2026-03-31T20:33:05.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2315: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset nodelete asdf 2026-03-31T20:33:05.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:05.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset nodelete asdf 2026-03-31T20:33:05.683 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:05.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:05.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2316: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset nodelete 2 2026-03-31T20:33:05.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:05.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset nodelete 2 2026-03-31T20:33:05.837 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:05.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:05.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2306: test_mon_osd_pool_set: for flag in nodelete nopgchange nosizechange write_fadvise_dontneed noscrub nodeep-scrub bulk 2026-03-31T20:33:05.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2307: test_mon_osd_pool_set: ceph osd pool set pool_getset nopgchange false 2026-03-31T20:33:07.336 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nopgchange to false 
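The loop now working through nopgchange (nodelete completed above) applies the same round-trip to every boolean pool flag; only true, false, 0 and 1 are accepted, and anything else fails with EINVAL as seen above. Condensed, with expect_false as in the earlier sketch:

for flag in nodelete nopgchange nosizechange write_fadvise_dontneed noscrub nodeep-scrub bulk; do
    ceph osd pool set pool_getset $flag false
    ceph osd pool get pool_getset $flag | grep "$flag: false"
    ceph osd pool set pool_getset $flag true
    ceph osd pool get pool_getset $flag | grep "$flag: true"
    ceph osd pool set pool_getset $flag 1                    # 1/0 map to true/false
    ceph osd pool get pool_getset $flag | grep "$flag: true"
    ceph osd pool set pool_getset $flag 0
    ceph osd pool get pool_getset $flag | grep "$flag: false"
    expect_false ceph osd pool set pool_getset $flag asdf    # EINVAL: not a boolean
    expect_false ceph osd pool set pool_getset $flag 2       # EINVAL: only 0 and 1 are accepted numbers
done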
2026-03-31T20:33:07.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: ceph osd pool get pool_getset nopgchange 2026-03-31T20:33:07.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: grep 'nopgchange: false' 2026-03-31T20:33:07.554 INFO:tasks.workunit.client.0.vm03.stdout:nopgchange: false 2026-03-31T20:33:07.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2309: test_mon_osd_pool_set: ceph osd pool set pool_getset nopgchange true 2026-03-31T20:33:09.357 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nopgchange to true 2026-03-31T20:33:09.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: ceph osd pool get pool_getset nopgchange 2026-03-31T20:33:09.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: grep 'nopgchange: true' 2026-03-31T20:33:09.580 INFO:tasks.workunit.client.0.vm03.stdout:nopgchange: true 2026-03-31T20:33:09.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2311: test_mon_osd_pool_set: ceph osd pool set pool_getset nopgchange 1 2026-03-31T20:33:11.383 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nopgchange to 1 2026-03-31T20:33:11.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: ceph osd pool get pool_getset nopgchange 2026-03-31T20:33:11.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: grep 'nopgchange: true' 2026-03-31T20:33:11.619 INFO:tasks.workunit.client.0.vm03.stdout:nopgchange: true 2026-03-31T20:33:11.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2313: test_mon_osd_pool_set: ceph osd pool set pool_getset nopgchange 0 2026-03-31T20:33:13.398 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nopgchange to 0 2026-03-31T20:33:13.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: ceph osd pool get pool_getset nopgchange 2026-03-31T20:33:13.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: grep 'nopgchange: false' 2026-03-31T20:33:13.616 INFO:tasks.workunit.client.0.vm03.stdout:nopgchange: false 2026-03-31T20:33:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2315: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset nopgchange asdf 2026-03-31T20:33:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset nopgchange asdf 2026-03-31T20:33:13.761 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:13.764 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:13.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2316: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset nopgchange 2 2026-03-31T20:33:13.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:13.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset nopgchange 2 2026-03-31T20:33:13.909 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:13.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:13.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2306: test_mon_osd_pool_set: for flag in nodelete nopgchange nosizechange write_fadvise_dontneed noscrub nodeep-scrub bulk 2026-03-31T20:33:13.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2307: test_mon_osd_pool_set: ceph osd pool set pool_getset nosizechange false 2026-03-31T20:33:15.413 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nosizechange to false 2026-03-31T20:33:15.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: ceph osd pool get pool_getset nosizechange 2026-03-31T20:33:15.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: grep 'nosizechange: false' 2026-03-31T20:33:15.635 INFO:tasks.workunit.client.0.vm03.stdout:nosizechange: false 2026-03-31T20:33:15.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2309: test_mon_osd_pool_set: ceph osd pool set pool_getset nosizechange true 2026-03-31T20:33:17.422 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nosizechange to true 2026-03-31T20:33:17.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: ceph osd pool get pool_getset nosizechange 2026-03-31T20:33:17.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: grep 'nosizechange: true' 2026-03-31T20:33:17.638 INFO:tasks.workunit.client.0.vm03.stdout:nosizechange: true 2026-03-31T20:33:17.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2311: test_mon_osd_pool_set: ceph osd pool set pool_getset nosizechange 1 2026-03-31T20:33:19.441 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nosizechange to 1 2026-03-31T20:33:19.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: ceph osd pool get pool_getset nosizechange 2026-03-31T20:33:19.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: grep 'nosizechange: true' 2026-03-31T20:33:19.661 
INFO:tasks.workunit.client.0.vm03.stdout:nosizechange: true 2026-03-31T20:33:19.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2313: test_mon_osd_pool_set: ceph osd pool set pool_getset nosizechange 0 2026-03-31T20:33:21.456 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nosizechange to 0 2026-03-31T20:33:21.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: ceph osd pool get pool_getset nosizechange 2026-03-31T20:33:21.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: grep 'nosizechange: false' 2026-03-31T20:33:21.673 INFO:tasks.workunit.client.0.vm03.stdout:nosizechange: false 2026-03-31T20:33:21.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2315: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset nosizechange asdf 2026-03-31T20:33:21.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:21.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset nosizechange asdf 2026-03-31T20:33:21.818 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:21.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:21.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2316: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset nosizechange 2 2026-03-31T20:33:21.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:21.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset nosizechange 2 2026-03-31T20:33:21.968 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:21.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:21.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2306: test_mon_osd_pool_set: for flag in nodelete nopgchange nosizechange write_fadvise_dontneed noscrub nodeep-scrub bulk 2026-03-31T20:33:21.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2307: test_mon_osd_pool_set: ceph osd pool set pool_getset write_fadvise_dontneed false 2026-03-31T20:33:23.465 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 write_fadvise_dontneed to false 2026-03-31T20:33:23.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: ceph osd pool get pool_getset write_fadvise_dontneed 2026-03-31T20:33:23.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: grep 'write_fadvise_dontneed: false' 
2026-03-31T20:33:23.697 INFO:tasks.workunit.client.0.vm03.stdout:write_fadvise_dontneed: false 2026-03-31T20:33:23.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2309: test_mon_osd_pool_set: ceph osd pool set pool_getset write_fadvise_dontneed true 2026-03-31T20:33:25.483 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 write_fadvise_dontneed to true 2026-03-31T20:33:25.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: ceph osd pool get pool_getset write_fadvise_dontneed 2026-03-31T20:33:25.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: grep 'write_fadvise_dontneed: true' 2026-03-31T20:33:25.700 INFO:tasks.workunit.client.0.vm03.stdout:write_fadvise_dontneed: true 2026-03-31T20:33:25.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2311: test_mon_osd_pool_set: ceph osd pool set pool_getset write_fadvise_dontneed 1 2026-03-31T20:33:27.496 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 write_fadvise_dontneed to 1 2026-03-31T20:33:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: ceph osd pool get pool_getset write_fadvise_dontneed 2026-03-31T20:33:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: grep 'write_fadvise_dontneed: true' 2026-03-31T20:33:27.711 INFO:tasks.workunit.client.0.vm03.stdout:write_fadvise_dontneed: true 2026-03-31T20:33:27.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2313: test_mon_osd_pool_set: ceph osd pool set pool_getset write_fadvise_dontneed 0 2026-03-31T20:33:29.513 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 write_fadvise_dontneed to 0 2026-03-31T20:33:29.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: ceph osd pool get pool_getset write_fadvise_dontneed 2026-03-31T20:33:29.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: grep 'write_fadvise_dontneed: false' 2026-03-31T20:33:29.727 INFO:tasks.workunit.client.0.vm03.stdout:write_fadvise_dontneed: false 2026-03-31T20:33:29.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2315: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset write_fadvise_dontneed asdf 2026-03-31T20:33:29.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:29.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset write_fadvise_dontneed asdf 2026-03-31T20:33:29.872 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:29.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:29.875 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2316: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset write_fadvise_dontneed 2 2026-03-31T20:33:29.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:29.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset write_fadvise_dontneed 2 2026-03-31T20:33:30.022 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:30.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:30.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2306: test_mon_osd_pool_set: for flag in nodelete nopgchange nosizechange write_fadvise_dontneed noscrub nodeep-scrub bulk 2026-03-31T20:33:30.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2307: test_mon_osd_pool_set: ceph osd pool set pool_getset noscrub false 2026-03-31T20:33:31.521 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 noscrub to false 2026-03-31T20:33:31.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: ceph osd pool get pool_getset noscrub 2026-03-31T20:33:31.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: grep 'noscrub: false' 2026-03-31T20:33:31.737 INFO:tasks.workunit.client.0.vm03.stdout:noscrub: false 2026-03-31T20:33:31.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2309: test_mon_osd_pool_set: ceph osd pool set pool_getset noscrub true 2026-03-31T20:33:33.540 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 noscrub to true 2026-03-31T20:33:33.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: ceph osd pool get pool_getset noscrub 2026-03-31T20:33:33.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: grep 'noscrub: true' 2026-03-31T20:33:33.763 INFO:tasks.workunit.client.0.vm03.stdout:noscrub: true 2026-03-31T20:33:33.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2311: test_mon_osd_pool_set: ceph osd pool set pool_getset noscrub 1 2026-03-31T20:33:35.556 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 noscrub to 1 2026-03-31T20:33:35.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: ceph osd pool get pool_getset noscrub 2026-03-31T20:33:35.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: grep 'noscrub: true' 2026-03-31T20:33:35.780 INFO:tasks.workunit.client.0.vm03.stdout:noscrub: true 2026-03-31T20:33:35.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2313: test_mon_osd_pool_set: ceph osd pool set 
pool_getset noscrub 0 2026-03-31T20:33:37.577 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 noscrub to 0 2026-03-31T20:33:37.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: ceph osd pool get pool_getset noscrub 2026-03-31T20:33:37.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: grep 'noscrub: false' 2026-03-31T20:33:37.798 INFO:tasks.workunit.client.0.vm03.stdout:noscrub: false 2026-03-31T20:33:37.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2315: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset noscrub asdf 2026-03-31T20:33:37.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:37.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset noscrub asdf 2026-03-31T20:33:37.946 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:37.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:37.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2316: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset noscrub 2 2026-03-31T20:33:37.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:37.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset noscrub 2 2026-03-31T20:33:38.102 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:38.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:38.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2306: test_mon_osd_pool_set: for flag in nodelete nopgchange nosizechange write_fadvise_dontneed noscrub nodeep-scrub bulk 2026-03-31T20:33:38.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2307: test_mon_osd_pool_set: ceph osd pool set pool_getset nodeep-scrub false 2026-03-31T20:33:39.594 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodeep-scrub to false 2026-03-31T20:33:39.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: ceph osd pool get pool_getset nodeep-scrub 2026-03-31T20:33:39.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: grep 'nodeep-scrub: false' 2026-03-31T20:33:39.810 INFO:tasks.workunit.client.0.vm03.stdout:nodeep-scrub: false 2026-03-31T20:33:39.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2309: test_mon_osd_pool_set: ceph osd pool set pool_getset nodeep-scrub true 2026-03-31T20:33:41.609 
INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodeep-scrub to true 2026-03-31T20:33:41.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: ceph osd pool get pool_getset nodeep-scrub 2026-03-31T20:33:41.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: grep 'nodeep-scrub: true' 2026-03-31T20:33:41.837 INFO:tasks.workunit.client.0.vm03.stdout:nodeep-scrub: true 2026-03-31T20:33:41.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2311: test_mon_osd_pool_set: ceph osd pool set pool_getset nodeep-scrub 1 2026-03-31T20:33:43.635 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodeep-scrub to 1 2026-03-31T20:33:43.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: ceph osd pool get pool_getset nodeep-scrub 2026-03-31T20:33:43.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: grep 'nodeep-scrub: true' 2026-03-31T20:33:43.853 INFO:tasks.workunit.client.0.vm03.stdout:nodeep-scrub: true 2026-03-31T20:33:43.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2313: test_mon_osd_pool_set: ceph osd pool set pool_getset nodeep-scrub 0 2026-03-31T20:33:45.651 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodeep-scrub to 0 2026-03-31T20:33:45.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: ceph osd pool get pool_getset nodeep-scrub 2026-03-31T20:33:45.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: grep 'nodeep-scrub: false' 2026-03-31T20:33:45.880 INFO:tasks.workunit.client.0.vm03.stdout:nodeep-scrub: false 2026-03-31T20:33:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2315: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset nodeep-scrub asdf 2026-03-31T20:33:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset nodeep-scrub asdf 2026-03-31T20:33:46.043 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:46.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:46.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2316: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset nodeep-scrub 2 2026-03-31T20:33:46.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:46.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset nodeep-scrub 2 
2026-03-31T20:33:46.200 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:46.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:46.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2306: test_mon_osd_pool_set: for flag in nodelete nopgchange nosizechange write_fadvise_dontneed noscrub nodeep-scrub bulk 2026-03-31T20:33:46.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2307: test_mon_osd_pool_set: ceph osd pool set pool_getset bulk false 2026-03-31T20:33:47.664 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 bulk to false 2026-03-31T20:33:47.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: ceph osd pool get pool_getset bulk 2026-03-31T20:33:47.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2308: test_mon_osd_pool_set: grep 'bulk: false' 2026-03-31T20:33:47.889 INFO:tasks.workunit.client.0.vm03.stdout:bulk: false 2026-03-31T20:33:47.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2309: test_mon_osd_pool_set: ceph osd pool set pool_getset bulk true 2026-03-31T20:33:49.683 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 bulk to true 2026-03-31T20:33:49.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: ceph osd pool get pool_getset bulk 2026-03-31T20:33:49.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2310: test_mon_osd_pool_set: grep 'bulk: true' 2026-03-31T20:33:49.909 INFO:tasks.workunit.client.0.vm03.stdout:bulk: true 2026-03-31T20:33:49.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2311: test_mon_osd_pool_set: ceph osd pool set pool_getset bulk 1 2026-03-31T20:33:51.698 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 bulk to 1 2026-03-31T20:33:51.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: ceph osd pool get pool_getset bulk 2026-03-31T20:33:51.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2312: test_mon_osd_pool_set: grep 'bulk: true' 2026-03-31T20:33:51.924 INFO:tasks.workunit.client.0.vm03.stdout:bulk: true 2026-03-31T20:33:51.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2313: test_mon_osd_pool_set: ceph osd pool set pool_getset bulk 0 2026-03-31T20:33:53.714 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 bulk to 0 2026-03-31T20:33:53.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: ceph osd pool get pool_getset bulk 2026-03-31T20:33:53.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2314: test_mon_osd_pool_set: grep 'bulk: false' 2026-03-31T20:33:53.948 INFO:tasks.workunit.client.0.vm03.stdout:bulk: false 2026-03-31T20:33:53.948 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2315: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset bulk asdf 2026-03-31T20:33:53.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:53.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset bulk asdf 2026-03-31T20:33:54.100 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:54.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:54.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2316: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset bulk 2 2026-03-31T20:33:54.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:54.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset bulk 2 2026-03-31T20:33:54.253 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: expecting value 'true', 'false', '0', or '1' 2026-03-31T20:33:54.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:54.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2319: test_mon_osd_pool_set: ceph osd pool get pool_getset scrub_min_interval 2026-03-31T20:33:54.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2319: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:33:54.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:54.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
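Every negative assertion in this stretch goes through the workunit's expect_false wrapper, visible in the xtrace as test.sh:35-36 (and expect_true as test.sh:41-42 further down): the wrapper re-enables tracing and inverts the wrapped command's exit status, so the monitor's EINVAL is the expected outcome and the trailing 'return 0' marks the check as passed. A sketch consistent with the trace, not a verbatim copy of the script:

    expect_false() {
        set -x
        # pass only if the wrapped command fails
        if "$@"; then return 1; else return 0; fi
    }

    expect_true() {
        set -x
        # pass only if the wrapped command succeeds
        if ! "$@"; then return 1; else return 0; fi
    }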
2026-03-31T20:33:54.406 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'scrub_min_interval' is not set on pool 'pool_getset' 2026-03-31T20:33:54.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:54.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2320: test_mon_osd_pool_set: ceph osd pool set pool_getset scrub_min_interval 123456 2026-03-31T20:33:55.724 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 scrub_min_interval to 123456 2026-03-31T20:33:55.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2321: test_mon_osd_pool_set: ceph osd pool get pool_getset scrub_min_interval 2026-03-31T20:33:55.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2321: test_mon_osd_pool_set: grep 'scrub_min_interval: 123456' 2026-03-31T20:33:55.955 INFO:tasks.workunit.client.0.vm03.stdout:scrub_min_interval: 123456 2026-03-31T20:33:55.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2322: test_mon_osd_pool_set: ceph osd pool set pool_getset scrub_min_interval 0 2026-03-31T20:33:57.743 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 scrub_min_interval to 0 2026-03-31T20:33:57.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2323: test_mon_osd_pool_set: ceph osd pool get pool_getset scrub_min_interval 2026-03-31T20:33:57.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2323: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:33:57.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:57.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:33:57.909 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'scrub_min_interval' is not set on pool 'pool_getset' 2026-03-31T20:33:57.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:57.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2325: test_mon_osd_pool_set: ceph osd pool get pool_getset scrub_max_interval 2026-03-31T20:33:57.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2325: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:33:57.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:33:57.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
2026-03-31T20:33:58.065 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'scrub_max_interval' is not set on pool 'pool_getset' 2026-03-31T20:33:58.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:33:58.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2326: test_mon_osd_pool_set: ceph osd pool set pool_getset scrub_max_interval 123456 2026-03-31T20:33:59.755 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 scrub_max_interval to 123456 2026-03-31T20:33:59.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2327: test_mon_osd_pool_set: ceph osd pool get pool_getset scrub_max_interval 2026-03-31T20:33:59.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2327: test_mon_osd_pool_set: grep 'scrub_max_interval: 123456' 2026-03-31T20:33:59.987 INFO:tasks.workunit.client.0.vm03.stdout:scrub_max_interval: 123456 2026-03-31T20:33:59.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2328: test_mon_osd_pool_set: ceph osd pool set pool_getset scrub_max_interval 0 2026-03-31T20:34:01.779 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 scrub_max_interval to 0 2026-03-31T20:34:01.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2329: test_mon_osd_pool_set: ceph osd pool get pool_getset scrub_max_interval 2026-03-31T20:34:01.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2329: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:34:01.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:01.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:34:01.948 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'scrub_max_interval' is not set on pool 'pool_getset' 2026-03-31T20:34:01.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:01.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2331: test_mon_osd_pool_set: ceph osd pool get pool_getset deep_scrub_interval 2026-03-31T20:34:01.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2331: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:34:01.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:01.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
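The scrub interval options behave differently from the boolean flags: scrub_min_interval, scrub_max_interval and deep_scrub_interval (next) are per-pool overrides that are absent by default, so ceph osd pool get fails with ENOENT until a value is set, and writing 0 removes the override again rather than storing a zero. The test asserts "unset" by piping the get through expect_false grep ., i.e. requiring empty output. The cycle the trace walks through, condensed:

    # absent by default: get fails with "Error ENOENT: option '...' is not set on pool 'pool_getset'"
    ceph osd pool get pool_getset scrub_min_interval | expect_false grep .
    ceph osd pool set pool_getset scrub_min_interval 123456
    ceph osd pool get pool_getset scrub_min_interval | grep 'scrub_min_interval: 123456'
    # setting 0 clears the per-pool override
    ceph osd pool set pool_getset scrub_min_interval 0
    ceph osd pool get pool_getset scrub_min_interval | expect_false grep .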
2026-03-31T20:34:02.112 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'deep_scrub_interval' is not set on pool 'pool_getset' 2026-03-31T20:34:02.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:02.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2332: test_mon_osd_pool_set: ceph osd pool set pool_getset deep_scrub_interval 123456 2026-03-31T20:34:03.794 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 deep_scrub_interval to 123456 2026-03-31T20:34:03.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2333: test_mon_osd_pool_set: ceph osd pool get pool_getset deep_scrub_interval 2026-03-31T20:34:03.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2333: test_mon_osd_pool_set: grep 'deep_scrub_interval: 123456' 2026-03-31T20:34:04.027 INFO:tasks.workunit.client.0.vm03.stdout:deep_scrub_interval: 123456 2026-03-31T20:34:04.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2334: test_mon_osd_pool_set: ceph osd pool set pool_getset deep_scrub_interval 0 2026-03-31T20:34:05.809 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 deep_scrub_interval to 0 2026-03-31T20:34:05.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2335: test_mon_osd_pool_set: ceph osd pool get pool_getset deep_scrub_interval 2026-03-31T20:34:05.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2335: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:34:05.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:05.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:34:05.978 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'deep_scrub_interval' is not set on pool 'pool_getset' 2026-03-31T20:34:05.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:05.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2337: test_mon_osd_pool_set: ceph osd pool get pool_getset recovery_priority 2026-03-31T20:34:05.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2337: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:34:05.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:05.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
2026-03-31T20:34:06.133 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'recovery_priority' is not set on pool 'pool_getset' 2026-03-31T20:34:06.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:06.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2338: test_mon_osd_pool_set: ceph osd pool set pool_getset recovery_priority 5 2026-03-31T20:34:07.828 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 recovery_priority to 5 2026-03-31T20:34:07.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2339: test_mon_osd_pool_set: ceph osd pool get pool_getset recovery_priority 2026-03-31T20:34:07.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2339: test_mon_osd_pool_set: grep 'recovery_priority: 5' 2026-03-31T20:34:08.067 INFO:tasks.workunit.client.0.vm03.stdout:recovery_priority: 5 2026-03-31T20:34:08.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2340: test_mon_osd_pool_set: ceph osd pool set pool_getset recovery_priority -5 2026-03-31T20:34:09.841 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 recovery_priority to -5 2026-03-31T20:34:09.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2341: test_mon_osd_pool_set: ceph osd pool get pool_getset recovery_priority 2026-03-31T20:34:09.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2341: test_mon_osd_pool_set: grep 'recovery_priority: -5' 2026-03-31T20:34:10.079 INFO:tasks.workunit.client.0.vm03.stdout:recovery_priority: -5 2026-03-31T20:34:10.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2342: test_mon_osd_pool_set: ceph osd pool set pool_getset recovery_priority 0 2026-03-31T20:34:11.855 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 recovery_priority to 0 2026-03-31T20:34:11.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2343: test_mon_osd_pool_set: ceph osd pool get pool_getset recovery_priority 2026-03-31T20:34:11.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2343: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:34:11.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:11.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
2026-03-31T20:34:12.017 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'recovery_priority' is not set on pool 'pool_getset' 2026-03-31T20:34:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2344: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset recovery_priority -11 2026-03-31T20:34:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset recovery_priority -11 2026-03-31T20:34:12.175 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: pool recovery_priority must be between -10 and 10 2026-03-31T20:34:12.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:12.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2345: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset recovery_priority 11 2026-03-31T20:34:12.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:12.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset recovery_priority 11 2026-03-31T20:34:12.334 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: pool recovery_priority must be between -10 and 10 2026-03-31T20:34:12.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:12.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2347: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:34:12.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2347: test_mon_osd_pool_set: ceph osd pool get pool_getset recovery_op_priority 2026-03-31T20:34:12.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:12.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
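recovery_priority takes signed values and is range-checked monitor-side: 5 and -5 are accepted, 0 clears the override (the following get returns ENOENT), and -11 and 11 are both refused with "Error EINVAL: pool recovery_priority must be between -10 and 10". Condensed from the trace:

    ceph osd pool set pool_getset recovery_priority 5    # accepted
    ceph osd pool set pool_getset recovery_priority -5   # accepted, negatives allowed
    ceph osd pool set pool_getset recovery_priority 0    # clears the override
    expect_false ceph osd pool set pool_getset recovery_priority -11   # out of range
    expect_false ceph osd pool set pool_getset recovery_priority 11    # out of range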
2026-03-31T20:34:12.491 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'recovery_op_priority' is not set on pool 'pool_getset' 2026-03-31T20:34:12.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:12.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2348: test_mon_osd_pool_set: ceph osd pool set pool_getset recovery_op_priority 5 2026-03-31T20:34:13.876 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 recovery_op_priority to 5 2026-03-31T20:34:13.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2349: test_mon_osd_pool_set: ceph osd pool get pool_getset recovery_op_priority 2026-03-31T20:34:13.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2349: test_mon_osd_pool_set: grep 'recovery_op_priority: 5' 2026-03-31T20:34:14.111 INFO:tasks.workunit.client.0.vm03.stdout:recovery_op_priority: 5 2026-03-31T20:34:14.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2350: test_mon_osd_pool_set: ceph osd pool set pool_getset recovery_op_priority 0 2026-03-31T20:34:15.911 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 recovery_op_priority to 0 2026-03-31T20:34:15.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2351: test_mon_osd_pool_set: ceph osd pool get pool_getset recovery_op_priority 2026-03-31T20:34:15.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2351: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:34:15.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:15.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:34:16.079 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'recovery_op_priority' is not set on pool 'pool_getset' 2026-03-31T20:34:16.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:16.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2353: test_mon_osd_pool_set: ceph osd pool get pool_getset scrub_priority 2026-03-31T20:34:16.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2353: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:34:16.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:16.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
2026-03-31T20:34:16.249 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'scrub_priority' is not set on pool 'pool_getset' 2026-03-31T20:34:16.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:16.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2354: test_mon_osd_pool_set: ceph osd pool set pool_getset scrub_priority 5 2026-03-31T20:34:17.927 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 scrub_priority to 5 2026-03-31T20:34:17.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2355: test_mon_osd_pool_set: ceph osd pool get pool_getset scrub_priority 2026-03-31T20:34:17.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2355: test_mon_osd_pool_set: grep 'scrub_priority: 5' 2026-03-31T20:34:18.164 INFO:tasks.workunit.client.0.vm03.stdout:scrub_priority: 5 2026-03-31T20:34:18.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2356: test_mon_osd_pool_set: ceph osd pool set pool_getset scrub_priority 0 2026-03-31T20:34:19.943 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 scrub_priority to 0 2026-03-31T20:34:19.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2357: test_mon_osd_pool_set: ceph osd pool get pool_getset scrub_priority 2026-03-31T20:34:19.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2357: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:34:19.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:19.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
2026-03-31T20:34:20.114 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'scrub_priority' is not set on pool 'pool_getset' 2026-03-31T20:34:20.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:20.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2359: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset target_size_ratio -3 2026-03-31T20:34:20.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:20.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset target_size_ratio -3 2026-03-31T20:34:20.288 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: target_size_ratio cannot be negative 2026-03-31T20:34:20.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:20.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2360: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset target_size_ratio abc 2026-03-31T20:34:20.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:20.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset target_size_ratio abc 2026-03-31T20:34:20.444 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: error parsing floating point value 'abc': strict_strtod: expected double, got: 'abc' 2026-03-31T20:34:20.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:20.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2361: test_mon_osd_pool_set: expect_true ceph osd pool set pool_getset target_size_ratio 0.1 2026-03-31T20:34:20.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:34:20.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: ceph osd pool set pool_getset target_size_ratio 0.1 2026-03-31T20:34:21.992 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 target_size_ratio to 0.1 2026-03-31T20:34:22.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:34:22.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2362: test_mon_osd_pool_set: expect_true ceph osd pool set pool_getset target_size_ratio 1 2026-03-31T20:34:22.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:34:22.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: ceph osd pool set pool_getset target_size_ratio 1 2026-03-31T20:34:24.053 INFO:tasks.workunit.client.0.vm03.stderr:set pool 
35 target_size_ratio to 1 2026-03-31T20:34:24.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:34:24.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2363: test_mon_osd_pool_set: ceph osd pool get pool_getset target_size_ratio 2026-03-31T20:34:24.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2363: test_mon_osd_pool_set: grep 'target_size_ratio: 1' 2026-03-31T20:34:24.296 INFO:tasks.workunit.client.0.vm03.stdout:target_size_ratio: 1 2026-03-31T20:34:24.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2365: test_mon_osd_pool_set: ceph osd pool set pool_getset nopgchange 1 2026-03-31T20:34:26.066 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nopgchange to 1 2026-03-31T20:34:26.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2366: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset pg_num 10 2026-03-31T20:34:26.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:26.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset pg_num 10 2026-03-31T20:34:26.240 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: pool pg_num change is disabled; you must unset nopgchange flag for the pool first 2026-03-31T20:34:26.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:26.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2367: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset pgp_num 10 2026-03-31T20:34:26.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:26.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset pgp_num 10 2026-03-31T20:34:26.401 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: pool pgp_num change is disabled; you must unset nopgchange flag for the pool first 2026-03-31T20:34:26.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:26.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2368: test_mon_osd_pool_set: ceph osd pool set pool_getset nopgchange 0 2026-03-31T20:34:28.085 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nopgchange to 0 2026-03-31T20:34:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2369: test_mon_osd_pool_set: ceph osd pool set pool_getset pg_num 10 2026-03-31T20:34:30.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2370: test_mon_osd_pool_set: wait_for_clean 2026-03-31T20:34:30.182 
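Two more monitor-side guards appear above: target_size_ratio must parse as a non-negative float (-3 is refused as negative, 'abc' fails in strict_strtod), and while nopgchange is set on a pool, pg_num and pgp_num changes are refused with EPERM until the flag is cleared. The nopgchange round-trip, condensed:

    ceph osd pool set pool_getset nopgchange 1
    # "Error EPERM: pool pg_num change is disabled; you must unset nopgchange flag for the pool first"
    expect_false ceph osd pool set pool_getset pg_num 10
    expect_false ceph osd pool set pool_getset pgp_num 10
    ceph osd pool set pool_getset nopgchange 0
    ceph osd pool set pool_getset pg_num 10   # now allowed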
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1652: wait_for_clean: local cmd= 2026-03-31T20:34:30.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1653: wait_for_clean: local num_active_clean=-1 2026-03-31T20:34:30.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1654: wait_for_clean: local cur_active_clean 2026-03-31T20:34:30.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: get_timeout_delays 90 .1 2026-03-31T20:34:30.182 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:34:30.182 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:34:30.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:34:30.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:34:30.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:34:30.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-31T20:34:30.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: local -a delays 2026-03-31T20:34:30.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1656: wait_for_clean: local -i loop=0 2026-03-31T20:34:30.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1658: wait_for_clean: flush_pg_stats 2026-03-31T20:34:30.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: local timeout=300 2026-03-31T20:34:30.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls 2026-03-31T20:34:30.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0 2026-03-31T20:34:30.477 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:34:30.477 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:34:30.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs= 2026-03-31T20:34:30.477 
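wait_for_clean polls on the schedule produced by get_timeout_delays 90 .1 from ceph-helpers.sh: steps double from 0.1s up to a 15s cap, and a final partial step tops the series up to the 90s budget (0.1+0.2+0.4+0.8+1.6+3.2+6.4+12.8 + 4*15 + 4.5 = 90, matching the delays array echoed above). A minimal re-implementation of that schedule, assuming only the doubling-with-cap behaviour read off the trace (the real helper also saves and restores xtrace state, as the shopt lines show):

    get_timeout_delays() {
        local timeout=$1 first_step=${2:-1} max_step=${3:-15}
        local i=$first_step total=0
        while awk "BEGIN {exit !($total + $i < $timeout)}"; do
            printf '%s ' "$i"
            total=$(awk "BEGIN {print $total + $i}")
            i=$(awk "BEGIN {s = $i * 2; if (s > $max_step) s = $max_step; print s}")
        done
        # final partial step so the delays sum to exactly $timeout
        awk "BEGIN {print $timeout - $total}"
    }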
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:34:30.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats 2026-03-31T20:34:30.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=949187772586 2026-03-31T20:34:30.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 949187772586 2026-03-31T20:34:30.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772586' 2026-03-31T20:34:30.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:34:30.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats 2026-03-31T20:34:30.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738538 2026-03-31T20:34:30.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738538 2026-03-31T20:34:30.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772586 1-34359738538' 2026-03-31T20:34:30.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:34:30.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats 2026-03-31T20:34:30.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738538 2026-03-31T20:34:30.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738538 2026-03-31T20:34:30.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772586 1-34359738538 2-34359738538' 2026-03-31T20:34:30.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:34:30.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-949187772586 2026-03-31T20:34:30.736 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:34:30.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0 2026-03-31T20:34:30.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-949187772586 2026-03-31T20:34:30.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:34:30.739 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 949187772586 2026-03-31T20:34:30.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=949187772586 2026-03-31T20:34:30.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 949187772586' 2026-03-31T20:34:30.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:34:30.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772584 -lt 949187772586 2026-03-31T20:34:30.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:34:31.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-31T20:34:31.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:34:32.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772584 -lt 949187772586 2026-03-31T20:34:32.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:34:33.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-31T20:34:33.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:34:33.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772586 -lt 949187772586 2026-03-31T20:34:33.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:34:33.404 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738538 2026-03-31T20:34:33.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:34:33.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1 2026-03-31T20:34:33.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738538 2026-03-31T20:34:33.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:34:33.408 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738538 2026-03-31T20:34:33.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738538 2026-03-31T20:34:33.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738538' 2026-03-31T20:34:33.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:34:33.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738538 -lt 34359738538 2026-03-31T20:34:33.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:34:33.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738538 2026-03-31T20:34:33.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:34:33.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2 2026-03-31T20:34:33.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738538 2026-03-31T20:34:33.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:34:33.638 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738538 2026-03-31T20:34:33.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738538 2026-03-31T20:34:33.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738538' 
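flush_pg_stats, traced above, is a two-phase barrier: `ceph tell osd.N flush_pg_stats` makes each OSD push its PG stats and returns a sequence number, and the helper then polls `ceph osd last-stat-seq N` until the mons report a value at least that high (osd.0 needed two 1s retries on this pass; osd.1 and osd.2 were already caught up). The core of the per-OSD wait, simplified from the helper (osd id is a placeholder; the real loop also counts down a 300s timeout):

    # Confirm the mons have absorbed the stats flushed by one OSD
    osd=0
    seq=$(timeout 300 ceph tell osd.$osd flush_pg_stats)   # OSD's flush sequence number
    while test "$(ceph osd last-stat-seq $osd)" -lt "$seq"; do
        sleep 1   # mon-side seq still behind the flush; retry
    done
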
2026-03-31T20:34:33.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-31T20:34:33.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738538 -lt 34359738538 2026-03-31T20:34:33.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: get_num_pgs 2026-03-31T20:34:33.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:34:33.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:34:34.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: test 19 == 0 2026-03-31T20:34:34.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1663: wait_for_clean: true 2026-03-31T20:34:34.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: get_num_active_clean 2026-03-31T20:34:34.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1360: get_num_active_clean: local expression 2026-03-31T20:34:34.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1361: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-31T20:34:34.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1362: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-31T20:34:34.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1363: get_num_active_clean: ceph --format json pg dump pgs 2026-03-31T20:34:34.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1364: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-31T20:34:34.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: cur_active_clean=9 2026-03-31T20:34:34.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: get_num_pgs 2026-03-31T20:34:34.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:34:34.375 
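get_num_active_clean, built up above, counts PGs by filtering the state strings from `pg dump pgs`: a PG counts only if its state contains both "active" and "clean" and does not contain "stale". The first sample returns 9 against get_num_pgs's 19 because the pg_num change has just created PGs that are still peering. The two probes, exactly as the trace runs them:

    # Total PGs in the cluster
    ceph --format json status | jq .pgmap.num_pgs
    # PGs that are active+clean and not stale
    ceph --format json pg dump pgs | jq '.pg_stats | [.[] | .state
        | select(contains("active") and contains("clean"))
        | select(contains("stale") | not)] | length'
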
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:34:34.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: test 9 = 19 2026-03-31T20:34:34.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1669: wait_for_clean: test 9 '!=' -1 2026-03-31T20:34:34.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1670: wait_for_clean: loop=0 2026-03-31T20:34:34.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1671: wait_for_clean: num_active_clean=9 2026-03-31T20:34:34.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1679: wait_for_clean: eval 2026-03-31T20:34:34.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1680: wait_for_clean: sleep 0.1 2026-03-31T20:34:34.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1681: wait_for_clean: loop+=1 2026-03-31T20:34:34.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1663: wait_for_clean: true 2026-03-31T20:34:34.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: get_num_active_clean 2026-03-31T20:34:34.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1360: get_num_active_clean: local expression 2026-03-31T20:34:34.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1361: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-31T20:34:34.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1362: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-31T20:34:34.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1363: get_num_active_clean: ceph --format json pg dump pgs 2026-03-31T20:34:34.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1364: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-31T20:34:34.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: cur_active_clean=9 2026-03-31T20:34:34.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: get_num_pgs 2026-03-31T20:34:34.983 
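The wait_for_clean iterations above tie the pieces together: each pass samples get_num_active_clean, breaks once it equals get_num_pgs, and otherwise sleeps the next entry of the delay schedule. Any progress (the active+clean count changing, or recovery throughput being reported) resets the schedule to the start, so only a genuinely stuck cluster walks through the full 90 seconds. A simplified rendering of the loop traced here, assuming the delays array from get_timeout_delays is already populated:

    wait_for_clean_sketch() {
        # Simplified from wait_for_clean in qa/standalone/ceph-helpers.sh
        local -i loop=0
        local num_active_clean=-1 cur_active_clean
        while true; do
            cur_active_clean=$(get_num_active_clean)
            test "$cur_active_clean" = "$(get_num_pgs)" && break
            if test "$cur_active_clean" != "$num_active_clean"; then
                loop=0                               # count moved: restart the backoff
                num_active_clean=$cur_active_clean
            elif get_is_making_recovery_progress; then
                loop=0                               # recovery traffic counts as progress
            elif (( loop >= ${#delays[*]} )); then
                return 1                             # schedule exhausted without going clean
            fi
            sleep "${delays[$loop]}"
            loop+=1
        done
        return 0
    }
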
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:34:34.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:34:35.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: test 9 = 19 2026-03-31T20:34:35.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1669: wait_for_clean: test 9 '!=' 9 2026-03-31T20:34:35.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1672: wait_for_clean: get_is_making_recovery_progress 2026-03-31T20:34:35.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1330: get_is_making_recovery_progress: local recovery_progress 2026-03-31T20:34:35.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1331: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-31T20:34:35.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1332: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-31T20:34:35.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1333: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-31T20:34:35.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: ceph --format json status 2026-03-31T20:34:35.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-31T20:34:35.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: local progress=null 2026-03-31T20:34:35.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: test null '!=' null 2026-03-31T20:34:35.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1674: wait_for_clean: (( 1 >= 13 )) 2026-03-31T20:34:35.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1679: wait_for_clean: eval 2026-03-31T20:34:35.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1680: wait_for_clean: sleep 0.2 2026-03-31T20:34:35.743 
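get_is_making_recovery_progress, traced above, sums three pgmap rate counters; when the cluster is not recovering those fields are not reported and jq yields null, so the `test null != null` check fails and wait_for_clean falls through to the backoff sleep (0.2s on this pass). The probe as run above:

    # null => no recovery counters reported; the helper treats that as no progress
    ceph --format json status | jq -r \
        '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec'
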
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1681: wait_for_clean: loop+=1 2026-03-31T20:34:35.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1663: wait_for_clean: true 2026-03-31T20:34:35.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: get_num_active_clean 2026-03-31T20:34:35.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1360: get_num_active_clean: local expression 2026-03-31T20:34:35.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1361: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-31T20:34:35.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1362: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-31T20:34:35.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1363: get_num_active_clean: ceph --format json pg dump pgs 2026-03-31T20:34:35.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1364: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-31T20:34:35.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: cur_active_clean=19 2026-03-31T20:34:35.973 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: get_num_pgs 2026-03-31T20:34:35.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:34:35.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:34:36.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: test 19 = 19 2026-03-31T20:34:36.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: break 2026-03-31T20:34:36.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1683: wait_for_clean: return 0 2026-03-31T20:34:36.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2371: test_mon_osd_pool_set: ceph osd pool set pool_getset pgp_num 10 2026-03-31T20:34:38.441 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 pgp_num to 10 2026-03-31T20:34:38.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2372: 
test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset pg_num 0 2026-03-31T20:34:38.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:38.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset pg_num 0 2026-03-31T20:34:38.616 INFO:tasks.workunit.client.0.vm03.stderr:Error ERANGE: 'pg_num' must be greater than 0 and less than or equal to 65536 (you may adjust 'mon max pool pg num' for higher values) 2026-03-31T20:34:38.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:38.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2373: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset pgp_num 0 2026-03-31T20:34:38.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:38.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset pgp_num 0 2026-03-31T20:34:38.775 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: specified pgp_num must > 0, but you set to 0 2026-03-31T20:34:38.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:38.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2375: test_mon_osd_pool_set: ceph osd pool get pool_getset pg_num 2026-03-31T20:34:38.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2375: test_mon_osd_pool_set: sed -e 's/pg_num: //' 2026-03-31T20:34:39.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2375: test_mon_osd_pool_set: old_pgs=10 2026-03-31T20:34:39.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2376: test_mon_osd_pool_set: ceph osd stat --format json 2026-03-31T20:34:39.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2376: test_mon_osd_pool_set: jq .num_osds 2026-03-31T20:34:39.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2376: test_mon_osd_pool_set: new_pgs=106 2026-03-31T20:34:39.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2377: test_mon_osd_pool_set: ceph osd pool set pool_getset pg_num 106 2026-03-31T20:34:40.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2378: test_mon_osd_pool_set: ceph osd pool set pool_getset pgp_num 106 2026-03-31T20:34:42.486 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 pgp_num to 106 2026-03-31T20:34:42.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2379: test_mon_osd_pool_set: wait_for_clean 2026-03-31T20:34:42.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1652: 
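After the zero-value guards (pg_num 0 fails with ERANGE against the 1..65536 bound named in the error; pgp_num 0 fails with EINVAL), the test scales the pool up: old_pgs reads back as 10, `ceph osd stat --format json | jq .num_osds` reports this cluster's 3 OSDs, and new_pgs comes out as 106, consistent with growing the pool by 32 PGs per OSD (10 + 3*32 = 106). A sketch of that computation; the 32-per-OSD factor is inferred from the numbers in this run:

    # Derive the enlarged pg_num the way the numbers above imply
    old_pgs=$(ceph osd pool get pool_getset pg_num | sed -e 's/pg_num: //')
    num_osds=$(ceph osd stat --format json | jq .num_osds)
    new_pgs=$(( old_pgs + num_osds * 32 ))      # 10 + 3*32 = 106 in this run
    ceph osd pool set pool_getset pg_num  "$new_pgs"
    ceph osd pool set pool_getset pgp_num "$new_pgs"
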
wait_for_clean: local cmd= 2026-03-31T20:34:42.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1653: wait_for_clean: local num_active_clean=-1 2026-03-31T20:34:42.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1654: wait_for_clean: local cur_active_clean 2026-03-31T20:34:42.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: get_timeout_delays 90 .1 2026-03-31T20:34:42.506 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:34:42.506 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:34:42.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:34:42.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:34:42.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:34:42.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-31T20:34:42.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: local -a delays 2026-03-31T20:34:42.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1656: wait_for_clean: local -i loop=0 2026-03-31T20:34:42.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1658: wait_for_clean: flush_pg_stats 2026-03-31T20:34:42.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: local timeout=300 2026-03-31T20:34:42.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls 2026-03-31T20:34:42.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0 2026-03-31T20:34:42.796 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:34:42.796 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:34:42.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs= 2026-03-31T20:34:42.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: 
for osd in $ids 2026-03-31T20:34:42.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats 2026-03-31T20:34:42.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=949187772590 2026-03-31T20:34:42.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 949187772590 2026-03-31T20:34:42.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772590' 2026-03-31T20:34:42.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:34:42.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats 2026-03-31T20:34:42.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738542 2026-03-31T20:34:42.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738542 2026-03-31T20:34:42.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772590 1-34359738542' 2026-03-31T20:34:42.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:34:42.978 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats 2026-03-31T20:34:43.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738542 2026-03-31T20:34:43.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738542 2026-03-31T20:34:43.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772590 1-34359738542 2-34359738542' 2026-03-31T20:34:43.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:34:43.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-949187772590 2026-03-31T20:34:43.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:34:43.073 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0 2026-03-31T20:34:43.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-949187772590 2026-03-31T20:34:43.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:34:43.076 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 949187772590 2026-03-31T20:34:43.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=949187772590 2026-03-31T20:34:43.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 949187772590' 2026-03-31T20:34:43.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:34:43.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772588 -lt 949187772590 2026-03-31T20:34:43.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:34:44.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-31T20:34:44.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:34:44.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772590 -lt 949187772590 2026-03-31T20:34:44.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:34:44.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738542 2026-03-31T20:34:44.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:34:44.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1 2026-03-31T20:34:44.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738542 2026-03-31T20:34:44.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:34:44.548 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738542 
2026-03-31T20:34:44.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738542 2026-03-31T20:34:44.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738542' 2026-03-31T20:34:44.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:34:44.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738542 -lt 34359738542 2026-03-31T20:34:44.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:34:44.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738542 2026-03-31T20:34:44.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:34:44.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2 2026-03-31T20:34:44.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738542 2026-03-31T20:34:44.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:34:44.768 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738542 2026-03-31T20:34:44.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738542 2026-03-31T20:34:44.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738542' 2026-03-31T20:34:44.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-31T20:34:44.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738542 -lt 34359738542 2026-03-31T20:34:44.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: get_num_pgs 2026-03-31T20:34:44.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:34:44.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:34:45.271 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: test 115 == 0 2026-03-31T20:34:45.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1663: wait_for_clean: true 2026-03-31T20:34:45.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: get_num_active_clean 2026-03-31T20:34:45.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1360: get_num_active_clean: local expression 2026-03-31T20:34:45.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1361: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-31T20:34:45.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1362: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-31T20:34:45.272 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1363: get_num_active_clean: ceph --format json pg dump pgs 2026-03-31T20:34:45.272 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1364: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-31T20:34:45.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: cur_active_clean=115 2026-03-31T20:34:45.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: get_num_pgs 2026-03-31T20:34:45.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:34:45.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:34:45.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: test 115 = 115 2026-03-31T20:34:45.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: break 2026-03-31T20:34:45.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1683: wait_for_clean: return 0 2026-03-31T20:34:45.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2381: test_mon_osd_pool_set: ceph osd pool set pool_getset nosizechange 1 2026-03-31T20:34:47.529 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nosizechange to 1 2026-03-31T20:34:47.542 
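With the pool clean again (115 of 115 PGs active+clean; the jump from 19 to 115 total matches the 96 PGs added by the bump to 106), the test moves on to the nosizechange flag, which gates replica-count changes the same way nopgchange gates PG counts: as the following records show, both size and min_size changes are refused with EPERM until the flag is cleared. In CLI form:

    # nosizechange blocks size/min_size edits until unset
    ceph osd pool set pool_getset nosizechange 1
    ceph osd pool set pool_getset size 2      || echo "rejected (EPERM) while nosizechange=1"
    ceph osd pool set pool_getset min_size 2  || echo "rejected (EPERM) while nosizechange=1"
    ceph osd pool set pool_getset nosizechange 0
    ceph osd pool set pool_getset size 2      # accepted; the test then waits for clean again
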
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2382: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset size 2 2026-03-31T20:34:47.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:47.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset size 2 2026-03-31T20:34:47.692 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: pool size change is disabled; you must unset nosizechange flag for the pool first 2026-03-31T20:34:47.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:47.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2383: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset min_size 2 2026-03-31T20:34:47.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:47.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset min_size 2 2026-03-31T20:34:47.850 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: pool min size change is disabled; you must unset nosizechange flag for the pool first 2026-03-31T20:34:47.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:47.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2384: test_mon_osd_pool_set: ceph osd pool set pool_getset nosizechange 0 2026-03-31T20:34:49.535 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nosizechange to 0 2026-03-31T20:34:49.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2385: test_mon_osd_pool_set: ceph osd pool set pool_getset size 2 2026-03-31T20:34:51.554 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 size to 2 2026-03-31T20:34:51.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2386: test_mon_osd_pool_set: wait_for_clean 2026-03-31T20:34:51.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1652: wait_for_clean: local cmd= 2026-03-31T20:34:51.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1653: wait_for_clean: local num_active_clean=-1 2026-03-31T20:34:51.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1654: wait_for_clean: local cur_active_clean 2026-03-31T20:34:51.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: get_timeout_delays 90 .1 2026-03-31T20:34:51.568 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace 2026-03-31T20:34:51.569 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true 2026-03-31T20:34:51.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true 2026-03-31T20:34:51.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true 2026-03-31T20:34:51.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace 2026-03-31T20:34:51.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-31T20:34:51.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: local -a delays 2026-03-31T20:34:51.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1656: wait_for_clean: local -i loop=0 2026-03-31T20:34:51.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1658: wait_for_clean: flush_pg_stats 2026-03-31T20:34:51.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: local timeout=300 2026-03-31T20:34:51.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls 2026-03-31T20:34:51.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0 2026-03-31T20:34:51.864 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-31T20:34:51.864 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-31T20:34:51.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs= 2026-03-31T20:34:51.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:34:51.864 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats 2026-03-31T20:34:51.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=949187772594 2026-03-31T20:34:51.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 949187772594 2026-03-31T20:34:51.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772594' 2026-03-31T20:34:51.951 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:34:51.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats 2026-03-31T20:34:52.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738546 2026-03-31T20:34:52.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738546 2026-03-31T20:34:52.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772594 1-34359738546' 2026-03-31T20:34:52.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids 2026-03-31T20:34:52.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats 2026-03-31T20:34:52.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738546 2026-03-31T20:34:52.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738546 2026-03-31T20:34:52.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772594 1-34359738546 2-34359738546' 2026-03-31T20:34:52.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:34:52.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-949187772594 2026-03-31T20:34:52.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:34:52.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0 2026-03-31T20:34:52.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-949187772594 2026-03-31T20:34:52.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:34:52.130 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 949187772594 2026-03-31T20:34:52.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=949187772594 2026-03-31T20:34:52.131 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 949187772594' 2026-03-31T20:34:52.131 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:34:52.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772592 -lt 949187772594 2026-03-31T20:34:52.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:34:53.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-31T20:34:53.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:34:53.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772592 -lt 949187772594 2026-03-31T20:34:53.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1 2026-03-31T20:34:54.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-31T20:34:54.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-31T20:34:54.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772594 -lt 949187772594 2026-03-31T20:34:54.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:34:54.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738546 2026-03-31T20:34:54.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:34:54.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1 2026-03-31T20:34:54.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738546 2026-03-31T20:34:54.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:34:54.780 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738546 2026-03-31T20:34:54.780 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738546 2026-03-31T20:34:54.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738546' 2026-03-31T20:34:54.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-31T20:34:54.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738546 -lt 34359738546 2026-03-31T20:34:54.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs 2026-03-31T20:34:54.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738546 2026-03-31T20:34:54.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1 2026-03-31T20:34:54.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2 2026-03-31T20:34:54.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738546 2026-03-31T20:34:54.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2 2026-03-31T20:34:54.992 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738546 2026-03-31T20:34:54.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738546 2026-03-31T20:34:54.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738546' 2026-03-31T20:34:54.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-31T20:34:55.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738546 -lt 34359738546 2026-03-31T20:34:55.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: get_num_pgs 2026-03-31T20:34:55.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:34:55.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:34:55.469 
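
The flush_pg_stats trace above comes from the qa/standalone/ceph-helpers.sh helper, which runs two loops: first it tells every OSD to flush its PG stats and records the sequence number each OSD returns, then it polls `ceph osd last-stat-seq <id>` until the monitor has caught up to every recorded sequence. A minimal sketch of that logic (simplified from the helper: the 300-second timeout and error paths are elided, and parameter expansion stands in for the `cut` calls seen in the trace):

    flush_pg_stats() {
        local seqs osd seq s
        # ask each OSD to push its PG stats and remember the returned seq
        for osd in $(ceph osd ls); do
            seq=$(ceph tell osd.$osd flush_pg_stats)
            seqs="$seqs $osd-$seq"
        done
        # block until the mon's last-stat-seq catches up for every OSD
        for s in $seqs; do
            osd=${s%%-*}
            seq=${s#*-}
            echo "waiting osd.$osd seq $seq"
            while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
                sleep 1
            done
        done
    }
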
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: test 115 == 0 2026-03-31T20:34:55.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1663: wait_for_clean: true 2026-03-31T20:34:55.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: get_num_active_clean 2026-03-31T20:34:55.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1360: get_num_active_clean: local expression 2026-03-31T20:34:55.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1361: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-31T20:34:55.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1362: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-31T20:34:55.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1363: get_num_active_clean: ceph --format json pg dump pgs 2026-03-31T20:34:55.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1364: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-31T20:34:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: cur_active_clean=115 2026-03-31T20:34:55.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: get_num_pgs 2026-03-31T20:34:55.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status 2026-03-31T20:34:55.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs 2026-03-31T20:34:55.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: test 115 = 115 2026-03-31T20:34:55.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: break 2026-03-31T20:34:55.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1683: wait_for_clean: return 0 2026-03-31T20:34:55.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2387: test_mon_osd_pool_set: ceph osd pool set pool_getset min_size 2 2026-03-31T20:34:57.584 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 min_size to 2 2026-03-31T20:34:57.607 
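
wait_for_clean, traced just above, declares the cluster clean once the number of PGs whose state contains "active" and "clean" (but not "stale") equals the total PG count. Both probes it uses can be replayed by hand against a live cluster:

    # total PG count
    ceph --format json status | jq .pgmap.num_pgs

    # PGs that are active+clean and not stale
    ceph --format json pg dump pgs |
        jq '.pg_stats | [.[] | .state
              | select(contains("active") and contains("clean"))
              | select(contains("stale") | not)] | length'

Here both report 115, so the helper returns 0 and the workunit proceeds to test_mon_osd_pool_set.
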
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2389: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset hashpspool 0 2026-03-31T20:34:57.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:57.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset hashpspool 0 2026-03-31T20:34:57.760 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: are you SURE? this will remap all placement groups in this pool, this triggers large data movement, pass --yes-i-really-mean-it if you really do. 2026-03-31T20:34:57.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:57.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2390: test_mon_osd_pool_set: ceph osd pool set pool_getset hashpspool 0 --yes-i-really-mean-it 2026-03-31T20:34:59.590 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 hashpspool to 0 2026-03-31T20:34:59.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2392: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset hashpspool 1 2026-03-31T20:34:59.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:34:59.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset hashpspool 1 2026-03-31T20:34:59.785 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: are you SURE? this will remap all placement groups in this pool, this triggers large data movement, pass --yes-i-really-mean-it if you really do. 2026-03-31T20:34:59.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:34:59.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2393: test_mon_osd_pool_set: ceph osd pool set pool_getset hashpspool 1 --yes-i-really-mean-it 2026-03-31T20:35:01.614 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 hashpspool to 1 2026-03-31T20:35:01.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2395: test_mon_osd_pool_set: ceph osd pool get rbd crush_rule 2026-03-31T20:35:01.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2395: test_mon_osd_pool_set: grep 'crush_rule: ' 2026-03-31T20:35:01.893 INFO:tasks.workunit.client.0.vm03.stdout:crush_rule: replicated_rule 2026-03-31T20:35:01.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2397: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_mode 2026-03-31T20:35:01.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2397: test_mon_osd_pool_set: expect_false grep . 
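
The recurring `expect_false: return 0` lines are the workunit's success path: expect_false inverts the exit status of the command it wraps, so a guarded command that fails as intended makes the helper return 0. Its definition in cephtool/test.sh is essentially:

    expect_false() {
        set -x
        # pass only if the wrapped command fails
        if "$@"; then return 1; else return 0; fi
    }

hashpspool is one of the guarded pool flags: toggling it remaps every PG in the pool, so the monitor refuses with EPERM unless --yes-i-really-mean-it is supplied, and the trace exercises both the refusal and the confirmed set in each direction (0 and 1).
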
2026-03-31T20:35:01.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:01.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:02.044 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_mode' is not set on pool 'pool_getset' 2026-03-31T20:35:02.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:02.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2398: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_mode aggressive 2026-03-31T20:35:03.632 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 compression_mode to aggressive 2026-03-31T20:35:03.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2399: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_mode 2026-03-31T20:35:03.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2399: test_mon_osd_pool_set: grep aggressive 2026-03-31T20:35:03.860 INFO:tasks.workunit.client.0.vm03.stdout:compression_mode: aggressive 2026-03-31T20:35:03.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2400: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_mode unset 2026-03-31T20:35:05.640 INFO:tasks.workunit.client.0.vm03.stderr:unset pool 35 compression_mode 2026-03-31T20:35:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2401: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_mode 2026-03-31T20:35:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2401: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:35:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:05.808 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_mode' is not set on pool 'pool_getset' 2026-03-31T20:35:05.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:05.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2403: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_algorithm 2026-03-31T20:35:05.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2403: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:35:05.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:05.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
2026-03-31T20:35:05.966 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_algorithm' is not set on pool 'pool_getset' 2026-03-31T20:35:05.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:05.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2404: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_algorithm zlib 2026-03-31T20:35:07.656 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 compression_algorithm to zlib 2026-03-31T20:35:07.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2405: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_algorithm 2026-03-31T20:35:07.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2405: test_mon_osd_pool_set: grep zlib 2026-03-31T20:35:07.883 INFO:tasks.workunit.client.0.vm03.stdout:compression_algorithm: zlib 2026-03-31T20:35:07.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2406: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_algorithm unset 2026-03-31T20:35:09.681 INFO:tasks.workunit.client.0.vm03.stderr:unset pool 35 compression_algorithm 2026-03-31T20:35:09.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2407: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_algorithm 2026-03-31T20:35:09.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2407: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:35:09.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:09.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:09.855 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_algorithm' is not set on pool 'pool_getset' 2026-03-31T20:35:09.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:09.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2409: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_required_ratio 2026-03-31T20:35:09.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2409: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:35:09.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:09.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 
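
compression_mode and compression_algorithm follow the same round trip the trace just walked through: unset by default (the get returns ENOENT, which `expect_false grep .` turns into a pass), settable to a concrete value, and clearable with the literal value `unset`:

    pool=pool_getset
    ceph osd pool get $pool compression_algorithm          # ENOENT: option is not set
    ceph osd pool set $pool compression_algorithm zlib
    ceph osd pool get $pool compression_algorithm          # compression_algorithm: zlib
    ceph osd pool set $pool compression_algorithm unset    # clears the per-pool override
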
2026-03-31T20:35:10.011 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_required_ratio' is not set on pool 'pool_getset' 2026-03-31T20:35:10.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:10.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2410: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset compression_required_ratio 1.1 2026-03-31T20:35:10.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:10.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset compression_required_ratio 1.1 2026-03-31T20:35:10.169 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: compression_required_ratio is out of range (0-1): '1.1' 2026-03-31T20:35:10.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:10.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2411: test_mon_osd_pool_set: expect_false ceph osd pool set pool_getset compression_required_ratio -.2 2026-03-31T20:35:10.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:10.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set pool_getset compression_required_ratio -.2 2026-03-31T20:35:10.327 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: compression_required_ratio is out of range (0-1): '-.2' 2026-03-31T20:35:10.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:10.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2412: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_required_ratio .2 2026-03-31T20:35:11.693 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 compression_required_ratio to .2 2026-03-31T20:35:11.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2413: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_required_ratio 2026-03-31T20:35:11.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2413: test_mon_osd_pool_set: grep .2 2026-03-31T20:35:11.918 INFO:tasks.workunit.client.0.vm03.stdout:compression_required_ratio: 0.2 2026-03-31T20:35:11.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2414: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_required_ratio 0 2026-03-31T20:35:13.711 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 compression_required_ratio to 0 2026-03-31T20:35:13.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2415: test_mon_osd_pool_set: expect_false grep . 
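
compression_required_ratio is range-checked rather than free-form: 1.1 and -.2 are both rejected with EINVAL ("out of range (0-1)"), .2 is accepted and read back normalized as 0.2, and, unlike the string-valued options above, it is cleared by setting it to 0 rather than to `unset`:

    ceph osd pool set pool_getset compression_required_ratio 1.1   # EINVAL: out of range (0-1)
    ceph osd pool set pool_getset compression_required_ratio .2    # ok, stored as 0.2
    ceph osd pool set pool_getset compression_required_ratio 0     # clears it; get returns ENOENT again
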
2026-03-31T20:35:13.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2415: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_required_ratio 2026-03-31T20:35:13.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:13.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:13.884 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_required_ratio' is not set on pool 'pool_getset' 2026-03-31T20:35:13.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:13.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2417: test_mon_osd_pool_set: ceph osd pool get pool_getset csum_type 2026-03-31T20:35:13.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2417: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:35:13.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:13.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:14.046 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'csum_type' is not set on pool 'pool_getset' 2026-03-31T20:35:14.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:14.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2418: test_mon_osd_pool_set: ceph osd pool set pool_getset csum_type crc32c 2026-03-31T20:35:15.797 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 csum_type to crc32c 2026-03-31T20:35:15.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2419: test_mon_osd_pool_set: ceph osd pool get pool_getset csum_type 2026-03-31T20:35:15.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2419: test_mon_osd_pool_set: grep crc32c 2026-03-31T20:35:16.034 INFO:tasks.workunit.client.0.vm03.stdout:csum_type: crc32c 2026-03-31T20:35:16.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2420: test_mon_osd_pool_set: ceph osd pool set pool_getset csum_type unset 2026-03-31T20:35:17.821 INFO:tasks.workunit.client.0.vm03.stderr:unset pool 35 csum_type 2026-03-31T20:35:17.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2421: test_mon_osd_pool_set: ceph osd pool get pool_getset csum_type 2026-03-31T20:35:17.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2421: test_mon_osd_pool_set: expect_false grep . 
2026-03-31T20:35:17.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:17.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:17.987 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'csum_type' is not set on pool 'pool_getset' 2026-03-31T20:35:17.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:17.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2423: test_mon_osd_pool_set: for size in compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block 2026-03-31T20:35:17.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2424: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_max_blob_size 2026-03-31T20:35:17.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2424: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:35:17.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:17.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:18.141 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_max_blob_size' is not set on pool 'pool_getset' 2026-03-31T20:35:18.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:18.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2425: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_max_blob_size 100 2026-03-31T20:35:19.827 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 compression_max_blob_size to 100 2026-03-31T20:35:19.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2426: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_max_blob_size 2026-03-31T20:35:19.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2426: test_mon_osd_pool_set: grep 100 2026-03-31T20:35:20.058 INFO:tasks.workunit.client.0.vm03.stdout:compression_max_blob_size: 100 2026-03-31T20:35:20.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2427: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_max_blob_size 0 2026-03-31T20:35:21.854 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 compression_max_blob_size to 0 2026-03-31T20:35:21.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2428: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_max_blob_size 2026-03-31T20:35:21.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2428: test_mon_osd_pool_set: expect_false grep . 
2026-03-31T20:35:21.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:21.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:22.021 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_max_blob_size' is not set on pool 'pool_getset' 2026-03-31T20:35:22.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:22.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2423: test_mon_osd_pool_set: for size in compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block 2026-03-31T20:35:22.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2424: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_min_blob_size 2026-03-31T20:35:22.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2424: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:35:22.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:22.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:22.184 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_min_blob_size' is not set on pool 'pool_getset' 2026-03-31T20:35:22.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:22.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2425: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_min_blob_size 100 2026-03-31T20:35:23.861 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 compression_min_blob_size to 100 2026-03-31T20:35:23.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2426: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_min_blob_size 2026-03-31T20:35:23.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2426: test_mon_osd_pool_set: grep 100 2026-03-31T20:35:24.082 INFO:tasks.workunit.client.0.vm03.stdout:compression_min_blob_size: 100 2026-03-31T20:35:24.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2427: test_mon_osd_pool_set: ceph osd pool set pool_getset compression_min_blob_size 0 2026-03-31T20:35:25.882 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 compression_min_blob_size to 0 2026-03-31T20:35:25.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2428: test_mon_osd_pool_set: ceph osd pool get pool_getset compression_min_blob_size 2026-03-31T20:35:25.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2428: test_mon_osd_pool_set: expect_false grep . 
2026-03-31T20:35:25.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:25.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:26.037 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'compression_min_blob_size' is not set on pool 'pool_getset' 2026-03-31T20:35:26.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:26.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2423: test_mon_osd_pool_set: for size in compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block 2026-03-31T20:35:26.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2424: test_mon_osd_pool_set: ceph osd pool get pool_getset csum_max_block 2026-03-31T20:35:26.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2424: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:35:26.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:26.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:26.185 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'csum_max_block' is not set on pool 'pool_getset' 2026-03-31T20:35:26.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:26.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2425: test_mon_osd_pool_set: ceph osd pool set pool_getset csum_max_block 100 2026-03-31T20:35:27.898 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 csum_max_block to 100 2026-03-31T20:35:27.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2426: test_mon_osd_pool_set: ceph osd pool get pool_getset csum_max_block 2026-03-31T20:35:27.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2426: test_mon_osd_pool_set: grep 100 2026-03-31T20:35:28.109 INFO:tasks.workunit.client.0.vm03.stdout:csum_max_block: 100 2026-03-31T20:35:28.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2427: test_mon_osd_pool_set: ceph osd pool set pool_getset csum_max_block 0 2026-03-31T20:35:29.904 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 csum_max_block to 0 2026-03-31T20:35:29.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2428: test_mon_osd_pool_set: ceph osd pool get pool_getset csum_max_block 2026-03-31T20:35:29.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2428: test_mon_osd_pool_set: expect_false grep . 
2026-03-31T20:35:29.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:29.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:30.068 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'csum_max_block' is not set on pool 'pool_getset' 2026-03-31T20:35:30.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:30.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2423: test_mon_osd_pool_set: for size in compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block 2026-03-31T20:35:30.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2424: test_mon_osd_pool_set: ceph osd pool get pool_getset csum_min_block 2026-03-31T20:35:30.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2424: test_mon_osd_pool_set: expect_false grep . 2026-03-31T20:35:30.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:30.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:30.218 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'csum_min_block' is not set on pool 'pool_getset' 2026-03-31T20:35:30.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:30.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2425: test_mon_osd_pool_set: ceph osd pool set pool_getset csum_min_block 100 2026-03-31T20:35:31.920 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 csum_min_block to 100 2026-03-31T20:35:31.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2426: test_mon_osd_pool_set: ceph osd pool get pool_getset csum_min_block 2026-03-31T20:35:31.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2426: test_mon_osd_pool_set: grep 100 2026-03-31T20:35:32.139 INFO:tasks.workunit.client.0.vm03.stdout:csum_min_block: 100 2026-03-31T20:35:32.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2427: test_mon_osd_pool_set: ceph osd pool set pool_getset csum_min_block 0 2026-03-31T20:35:33.941 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 csum_min_block to 0 2026-03-31T20:35:33.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2428: test_mon_osd_pool_set: ceph osd pool get pool_getset csum_min_block 2026-03-31T20:35:33.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2428: test_mon_osd_pool_set: expect_false grep . 
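
The four near-identical passes above are one loop in test.sh (lines 2423-2428) applying the same cycle to every size-valued option, with 0 again meaning "unset". The loop being traced is roughly:

    for size in compression_max_blob_size compression_min_blob_size \
                csum_max_block csum_min_block; do
        ceph osd pool get pool_getset $size | expect_false grep .  # unset by default
        ceph osd pool set pool_getset $size 100
        ceph osd pool get pool_getset $size | grep 100             # verify it took
        ceph osd pool set pool_getset $size 0                      # 0 clears it again
    done
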
2026-03-31T20:35:33.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:33.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep . 2026-03-31T20:35:34.098 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'csum_min_block' is not set on pool 'pool_getset' 2026-03-31T20:35:34.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:34.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2431: test_mon_osd_pool_set: ceph osd pool set pool_getset nodelete 1 2026-03-31T20:35:35.961 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodelete to 1 2026-03-31T20:35:35.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2432: test_mon_osd_pool_set: expect_false ceph osd pool delete pool_getset pool_getset --yes-i-really-really-mean-it 2026-03-31T20:35:35.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:35.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool delete pool_getset pool_getset --yes-i-really-really-mean-it 2026-03-31T20:35:36.130 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: pool deletion is disabled; you must unset nodelete flag for the pool first 2026-03-31T20:35:36.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:36.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2433: test_mon_osd_pool_set: ceph osd pool set pool_getset nodelete 0 2026-03-31T20:35:37.975 INFO:tasks.workunit.client.0.vm03.stderr:set pool 35 nodelete to 0 2026-03-31T20:35:37.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2434: test_mon_osd_pool_set: ceph osd pool delete pool_getset pool_getset --yes-i-really-really-mean-it 2026-03-31T20:35:39.064 INFO:tasks.workunit.client.0.vm03.stderr:pool 'pool_getset' does not exist 2026-03-31T20:35:39.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:35:39.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_osd_tiered_pool_set 2026-03-31T20:35:39.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2441: test_mon_osd_tiered_pool_set: ceph osd pool create real-tier 2 2026-03-31T20:35:40.060 INFO:tasks.workunit.client.0.vm03.stderr:pool 'real-tier' already exists 2026-03-31T20:35:40.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2442: test_mon_osd_tiered_pool_set: ceph osd tier add rbd real-tier 2026-03-31T20:35:41.070 INFO:tasks.workunit.client.0.vm03.stderr:pool 'real-tier' is now (or already was) a tier of 'rbd' 2026-03-31T20:35:41.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2445: 
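
The nodelete flag behaves as advertised: while it is set, `ceph osd pool delete` fails with EPERM even with --yes-i-really-really-mean-it, and the deletion only succeeds after the flag is cleared:

    ceph osd pool set pool_getset nodelete 1
    ceph osd pool delete pool_getset pool_getset --yes-i-really-really-mean-it   # EPERM
    ceph osd pool set pool_getset nodelete 0
    ceph osd pool delete pool_getset pool_getset --yes-i-really-really-mean-it   # succeeds

The `pool 'real-tier' already exists` message on the next create is harmless: the tiering test picks up a pool left over from an earlier part of the workunit, and `ceph osd tier add` is likewise idempotent ("is now (or already was) a tier of 'rbd'").
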
test_mon_osd_tiered_pool_set: for o in hit_set_period hit_set_count hit_set_fpp 2026-03-31T20:35:41.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2446: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real_tier hit_set_period -1 2026-03-31T20:35:41.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:41.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real_tier hit_set_period -1 2026-03-31T20:35:41.229 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: unrecognized pool 'real_tier' 2026-03-31T20:35:41.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:41.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2445: test_mon_osd_tiered_pool_set: for o in hit_set_period hit_set_count hit_set_fpp 2026-03-31T20:35:41.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2446: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real_tier hit_set_count -1 2026-03-31T20:35:41.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:41.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real_tier hit_set_count -1 2026-03-31T20:35:41.383 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: unrecognized pool 'real_tier' 2026-03-31T20:35:41.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:41.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2445: test_mon_osd_tiered_pool_set: for o in hit_set_period hit_set_count hit_set_fpp 2026-03-31T20:35:41.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2446: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real_tier hit_set_fpp -1 2026-03-31T20:35:41.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:41.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real_tier hit_set_fpp -1 2026-03-31T20:35:41.536 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: unrecognized pool 'real_tier' 2026-03-31T20:35:41.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:41.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2450: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real_tier hit_set_fpp 2 2026-03-31T20:35:41.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:41.539 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real_tier hit_set_fpp 2 2026-03-31T20:35:41.691 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: unrecognized pool 'real_tier' 2026-03-31T20:35:41.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:41.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2452: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier hit_set_type explicit_hash 2026-03-31T20:35:43.029 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 hit_set_type to explicit_hash 2026-03-31T20:35:43.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2453: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier hit_set_type 2026-03-31T20:35:43.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2453: test_mon_osd_tiered_pool_set: grep 'hit_set_type: explicit_hash' 2026-03-31T20:35:43.243 INFO:tasks.workunit.client.0.vm03.stdout:hit_set_type: explicit_hash 2026-03-31T20:35:43.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2454: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier hit_set_type explicit_object 2026-03-31T20:35:45.083 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 hit_set_type to explicit_object 2026-03-31T20:35:45.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2455: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier hit_set_type 2026-03-31T20:35:45.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2455: test_mon_osd_tiered_pool_set: grep 'hit_set_type: explicit_object' 2026-03-31T20:35:45.307 INFO:tasks.workunit.client.0.vm03.stdout:hit_set_type: explicit_object 2026-03-31T20:35:45.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2456: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier hit_set_type bloom 2026-03-31T20:35:47.092 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 hit_set_type to bloom 2026-03-31T20:35:47.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2457: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier hit_set_type 2026-03-31T20:35:47.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2457: test_mon_osd_tiered_pool_set: grep 'hit_set_type: bloom' 2026-03-31T20:35:47.316 INFO:tasks.workunit.client.0.vm03.stdout:hit_set_type: bloom 2026-03-31T20:35:47.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2458: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real-tier hit_set_type i_dont_exist 2026-03-31T20:35:47.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:47.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real-tier hit_set_type i_dont_exist 2026-03-31T20:35:47.463 
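
One quirk in the hit_set stretch: the expect_false calls at test.sh:2446 and 2450 name the pool `real_tier` (underscore) while the pool actually created is `real-tier` (hyphen). Those commands therefore fail with ENOENT ("unrecognized pool 'real_tier'") rather than with the range error the assertions presumably intend, and expect_false passes either way:

    ceph osd pool set real_tier hit_set_period -1    # ENOENT: unrecognized pool 'real_tier'
    ceph osd pool set real-tier hit_set_period 123   # the pool actually under test

The positive checks that follow do use the hyphenated name, so hit_set_type is exercised for real: explicit_hash, explicit_object, and bloom are accepted, and an unknown type draws EINVAL just below.
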
INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: unrecognized hit_set type 'i_dont_exist' 2026-03-31T20:35:47.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:47.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2459: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier hit_set_period 123 2026-03-31T20:35:49.105 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 hit_set_period to 123 2026-03-31T20:35:49.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2460: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier hit_set_period 2026-03-31T20:35:49.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2460: test_mon_osd_tiered_pool_set: grep 'hit_set_period: 123' 2026-03-31T20:35:49.331 INFO:tasks.workunit.client.0.vm03.stdout:hit_set_period: 123 2026-03-31T20:35:49.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2461: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier hit_set_count 12 2026-03-31T20:35:51.124 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 hit_set_count to 12 2026-03-31T20:35:51.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2462: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier hit_set_count 2026-03-31T20:35:51.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2462: test_mon_osd_tiered_pool_set: grep 'hit_set_count: 12' 2026-03-31T20:35:51.344 INFO:tasks.workunit.client.0.vm03.stdout:hit_set_count: 12 2026-03-31T20:35:51.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2463: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier hit_set_fpp .01 2026-03-31T20:35:53.138 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 hit_set_fpp to .01 2026-03-31T20:35:53.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2464: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier hit_set_fpp 2026-03-31T20:35:53.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2464: test_mon_osd_tiered_pool_set: grep 'hit_set_fpp: 0.01' 2026-03-31T20:35:53.367 INFO:tasks.workunit.client.0.vm03.stdout:hit_set_fpp: 0.01 2026-03-31T20:35:53.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2466: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier target_max_objects 123 2026-03-31T20:35:55.157 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 target_max_objects to 123 2026-03-31T20:35:55.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2467: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier target_max_objects 2026-03-31T20:35:55.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2468: test_mon_osd_tiered_pool_set: grep 'target_max_objects:[ \t]\+123' 2026-03-31T20:35:55.381 INFO:tasks.workunit.client.0.vm03.stdout:target_max_objects: 123 2026-03-31T20:35:55.381 
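
The four hit_set tunables being set and read back are the cache-tier hit-tracking knobs. As a quick reference (semantics per the Ceph cache tiering documentation):

    ceph osd pool set real-tier hit_set_type bloom   # track object hits with a bloom filter
    ceph osd pool set real-tier hit_set_period 123   # seconds covered by each hit set
    ceph osd pool set real-tier hit_set_count 12     # number of hit sets to retain
    ceph osd pool set real-tier hit_set_fpp .01      # bloom false-positive probability
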
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2469: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier target_max_bytes 123456 2026-03-31T20:35:57.169 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 target_max_bytes to 123456 2026-03-31T20:35:57.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2470: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier target_max_bytes 2026-03-31T20:35:57.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2471: test_mon_osd_tiered_pool_set: grep 'target_max_bytes:[ \t]\+123456' 2026-03-31T20:35:57.402 INFO:tasks.workunit.client.0.vm03.stdout:target_max_bytes: 123456 2026-03-31T20:35:57.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2472: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier cache_target_dirty_ratio .123 2026-03-31T20:35:59.184 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 cache_target_dirty_ratio to .123 2026-03-31T20:35:59.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2473: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier cache_target_dirty_ratio 2026-03-31T20:35:59.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2474: test_mon_osd_tiered_pool_set: grep 'cache_target_dirty_ratio:[ \t]\+0.123' 2026-03-31T20:35:59.406 INFO:tasks.workunit.client.0.vm03.stdout:cache_target_dirty_ratio: 0.123 2026-03-31T20:35:59.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2475: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real-tier cache_target_dirty_ratio -.2 2026-03-31T20:35:59.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:59.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real-tier cache_target_dirty_ratio -.2 2026-03-31T20:35:59.564 INFO:tasks.workunit.client.0.vm03.stderr:Error ERANGE: value must be in the range 0..1 2026-03-31T20:35:59.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:59.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2476: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real-tier cache_target_dirty_ratio 1.1 2026-03-31T20:35:59.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:35:59.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real-tier cache_target_dirty_ratio 1.1 2026-03-31T20:35:59.719 INFO:tasks.workunit.client.0.vm03.stderr:Error ERANGE: value must be in the range 0..1 2026-03-31T20:35:59.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:35:59.724 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2477: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier cache_target_dirty_high_ratio .123 2026-03-31T20:36:01.200 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 cache_target_dirty_high_ratio to .123 2026-03-31T20:36:01.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2478: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier cache_target_dirty_high_ratio 2026-03-31T20:36:01.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2479: test_mon_osd_tiered_pool_set: grep 'cache_target_dirty_high_ratio:[ \t]\+0.123' 2026-03-31T20:36:01.429 INFO:tasks.workunit.client.0.vm03.stdout:cache_target_dirty_high_ratio: 0.123 2026-03-31T20:36:01.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2480: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real-tier cache_target_dirty_high_ratio -.2 2026-03-31T20:36:01.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:01.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real-tier cache_target_dirty_high_ratio -.2 2026-03-31T20:36:01.586 INFO:tasks.workunit.client.0.vm03.stderr:Error ERANGE: value must be in the range 0..1 2026-03-31T20:36:01.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:01.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2481: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real-tier cache_target_dirty_high_ratio 1.1 2026-03-31T20:36:01.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:01.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real-tier cache_target_dirty_high_ratio 1.1 2026-03-31T20:36:01.744 INFO:tasks.workunit.client.0.vm03.stderr:Error ERANGE: value must be in the range 0..1 2026-03-31T20:36:01.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:01.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2482: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier cache_target_full_ratio .123 2026-03-31T20:36:03.209 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 cache_target_full_ratio to .123 2026-03-31T20:36:03.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2483: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier cache_target_full_ratio 2026-03-31T20:36:03.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2484: test_mon_osd_tiered_pool_set: grep 'cache_target_full_ratio:[ \t]\+0.123' 2026-03-31T20:36:03.445 INFO:tasks.workunit.client.0.vm03.stdout:cache_target_full_ratio: 0.123 2026-03-31T20:36:03.445 
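
cache_target_dirty_ratio and cache_target_dirty_high_ratio get the same validation: anything outside 0..1 is refused with ERANGE, and an accepted .123 reads back normalized as 0.123:

    ceph osd pool set real-tier cache_target_dirty_ratio .123        # ok, reads back as 0.123
    ceph osd pool set real-tier cache_target_dirty_ratio -.2         # ERANGE: must be in 0..1
    ceph osd pool set real-tier cache_target_dirty_high_ratio 1.1    # ERANGE: must be in 0..1
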
2026-03-31T20:36:03.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2485: test_mon_osd_tiered_pool_set: ceph osd dump -f json-pretty
2026-03-31T20:36:03.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2485: test_mon_osd_tiered_pool_set: grep '"cache_target_full_ratio_micro": 123000'
2026-03-31T20:36:03.661 INFO:tasks.workunit.client.0.vm03.stdout: "cache_target_full_ratio_micro": 123000,
2026-03-31T20:36:03.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2486: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier cache_target_full_ratio 1.0
2026-03-31T20:36:05.433 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 cache_target_full_ratio to 1.0
2026-03-31T20:36:05.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2487: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier cache_target_full_ratio 0
2026-03-31T20:36:07.441 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 cache_target_full_ratio to 0
2026-03-31T20:36:07.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2488: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set real-tier cache_target_full_ratio 1.1
2026-03-31T20:36:07.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:07.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set real-tier cache_target_full_ratio 1.1
2026-03-31T20:36:07.608 INFO:tasks.workunit.client.0.vm03.stderr:Error ERANGE: value must be in the range 0..1
2026-03-31T20:36:07.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:07.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2489: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier cache_min_flush_age 123
2026-03-31T20:36:09.456 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 cache_min_flush_age to 123
2026-03-31T20:36:09.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2490: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier cache_min_flush_age
2026-03-31T20:36:09.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2491: test_mon_osd_tiered_pool_set: grep 'cache_min_flush_age:[ \t]\+123'
2026-03-31T20:36:09.693 INFO:tasks.workunit.client.0.vm03.stdout:cache_min_flush_age: 123
2026-03-31T20:36:09.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2492: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier cache_min_evict_age 234
2026-03-31T20:36:11.476 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 cache_min_evict_age to 234
2026-03-31T20:36:11.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2493: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier cache_min_evict_age
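
The dump check above shows how the ratio is actually persisted: cache_target_full_ratio 0.123 appears in the OSDMap as cache_target_full_ratio_micro 123000, i.e. the value is stored in millionths. A hedged sketch of the same check with jq instead of grepping the pretty-printed dump (field names as they appear in this log; the exact JSON layout of ceph osd dump may vary by release):

    # read the micro-units field back for pool "real-tier"
    micro=$(ceph osd dump -f json |
            jq '.pools[] | select(.pool_name == "real-tier") | .cache_target_full_ratio_micro')
    test "$micro" -eq 123000   # 0.123 * 10^6
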
2026-03-31T20:36:11.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2494: test_mon_osd_tiered_pool_set: grep 'cache_min_evict_age:[ \t]\+234'
2026-03-31T20:36:11.707 INFO:tasks.workunit.client.0.vm03.stdout:cache_min_evict_age: 234
2026-03-31T20:36:11.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2497: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier target_max_objects 1K
2026-03-31T20:36:13.485 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 target_max_objects to 1K
2026-03-31T20:36:13.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2498: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier target_max_objects
2026-03-31T20:36:13.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2498: test_mon_osd_tiered_pool_set: grep 1000
2026-03-31T20:36:13.716 INFO:tasks.workunit.client.0.vm03.stdout:target_max_objects: 1000
2026-03-31T20:36:13.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2499: test_mon_osd_tiered_pool_set: for o in target_max_bytes target_size_bytes compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block
2026-03-31T20:36:13.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2500: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier target_max_bytes 1Ki
2026-03-31T20:36:15.499 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 target_max_bytes to 1Ki
2026-03-31T20:36:15.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier target_max_bytes --format=json
2026-03-31T20:36:15.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: jq -c .target_max_bytes
2026-03-31T20:36:15.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: val=1024
2026-03-31T20:36:15.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2502: test_mon_osd_tiered_pool_set: [[ 1024 == 1024 ]]
2026-03-31T20:36:15.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2503: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier target_max_bytes 1M
2026-03-31T20:36:17.525 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 target_max_bytes to 1M
2026-03-31T20:36:17.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier target_max_bytes --format=json
2026-03-31T20:36:17.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: jq -c .target_max_bytes
2026-03-31T20:36:17.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: val=1048576
2026-03-31T20:36:17.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2505: test_mon_osd_tiered_pool_set: [[ 1048576 == 1048576 ]]
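
Note the two suffix conventions exercised here: the object-count option parses SI suffixes (target_max_objects 1K reads back as 1000), while the byte-size options parse binary quantities (target_max_bytes 1Ki reads back as 1024, and 1M as 1048576). A short sketch of the distinction exactly as this run demonstrates it:

    ceph osd pool set real-tier target_max_objects 1K
    ceph osd pool get real-tier target_max_objects --format=json | jq .target_max_objects   # -> 1000
    ceph osd pool set real-tier target_max_bytes 1Ki
    ceph osd pool get real-tier target_max_bytes --format=json | jq .target_max_bytes       # -> 1024
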
2026-03-31T20:36:17.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2499: test_mon_osd_tiered_pool_set: for o in target_max_bytes target_size_bytes compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block
2026-03-31T20:36:17.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2500: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier target_size_bytes 1Ki
2026-03-31T20:36:19.531 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 target_size_bytes to 1Ki
2026-03-31T20:36:19.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier target_size_bytes --format=json
2026-03-31T20:36:19.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: jq -c .target_size_bytes
2026-03-31T20:36:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: val=1024
2026-03-31T20:36:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2502: test_mon_osd_tiered_pool_set: [[ 1024 == 1024 ]]
2026-03-31T20:36:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2503: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier target_size_bytes 1M
2026-03-31T20:36:21.554 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 target_size_bytes to 1M
2026-03-31T20:36:21.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier target_size_bytes --format=json
2026-03-31T20:36:21.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: jq -c .target_size_bytes
2026-03-31T20:36:21.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: val=1048576
2026-03-31T20:36:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2505: test_mon_osd_tiered_pool_set: [[ 1048576 == 1048576 ]]
2026-03-31T20:36:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2499: test_mon_osd_tiered_pool_set: for o in target_max_bytes target_size_bytes compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block
2026-03-31T20:36:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2500: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier compression_max_blob_size 1Ki
2026-03-31T20:36:23.567 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 compression_max_blob_size to 1Ki
2026-03-31T20:36:23.582 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier compression_max_blob_size --format=json
2026-03-31T20:36:23.582 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: jq -c .compression_max_blob_size
2026-03-31T20:36:23.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: val=1024
2026-03-31T20:36:23.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2502: test_mon_osd_tiered_pool_set: [[ 1024 == 1024 ]]
2026-03-31T20:36:23.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2503: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier compression_max_blob_size 1M
2026-03-31T20:36:25.579 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 compression_max_blob_size to 1M
2026-03-31T20:36:25.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier compression_max_blob_size --format=json
2026-03-31T20:36:25.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: jq -c .compression_max_blob_size
2026-03-31T20:36:25.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: val=1048576
2026-03-31T20:36:25.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2505: test_mon_osd_tiered_pool_set: [[ 1048576 == 1048576 ]]
2026-03-31T20:36:25.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2499: test_mon_osd_tiered_pool_set: for o in target_max_bytes target_size_bytes compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block
2026-03-31T20:36:25.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2500: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier compression_min_blob_size 1Ki
2026-03-31T20:36:27.593 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 compression_min_blob_size to 1Ki
2026-03-31T20:36:27.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier compression_min_blob_size --format=json
2026-03-31T20:36:27.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: jq -c .compression_min_blob_size
2026-03-31T20:36:27.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: val=1024
2026-03-31T20:36:27.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2502: test_mon_osd_tiered_pool_set: [[ 1024 == 1024 ]]
2026-03-31T20:36:27.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2503: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier compression_min_blob_size 1M
2026-03-31T20:36:29.605 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 compression_min_blob_size to 1M
2026-03-31T20:36:29.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier compression_min_blob_size --format=json
2026-03-31T20:36:29.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: jq -c .compression_min_blob_size
2026-03-31T20:36:29.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: val=1048576
2026-03-31T20:36:29.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2505: test_mon_osd_tiered_pool_set: [[ 1048576 == 1048576 ]]
2026-03-31T20:36:29.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2499: test_mon_osd_tiered_pool_set: for o in target_max_bytes target_size_bytes compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block
2026-03-31T20:36:29.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2500: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier csum_max_block 1Ki
2026-03-31T20:36:31.616 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 csum_max_block to 1Ki
2026-03-31T20:36:31.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier csum_max_block --format=json
2026-03-31T20:36:31.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: jq -c .csum_max_block
2026-03-31T20:36:31.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: val=1024
2026-03-31T20:36:31.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2502: test_mon_osd_tiered_pool_set: [[ 1024 == 1024 ]]
2026-03-31T20:36:31.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2503: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier csum_max_block 1M
2026-03-31T20:36:33.627 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 csum_max_block to 1M
2026-03-31T20:36:33.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier csum_max_block --format=json
2026-03-31T20:36:33.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: jq -c .csum_max_block
2026-03-31T20:36:33.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: val=1048576
2026-03-31T20:36:33.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2505: test_mon_osd_tiered_pool_set: [[ 1048576 == 1048576 ]]
2026-03-31T20:36:33.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2499: test_mon_osd_tiered_pool_set: for o in target_max_bytes target_size_bytes compression_max_blob_size compression_min_blob_size csum_max_block csum_min_block
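
The interleaved xtrace above is a single loop unrolled once per option; reconstructed from the trace (test.sh:2499-2505), the shape of the test is:

    # each size-valued pool option must round-trip 1Ki -> 1024 and 1M -> 1048576
    for o in target_max_bytes target_size_bytes compression_max_blob_size \
             compression_min_blob_size csum_max_block csum_min_block; do
        ceph osd pool set real-tier $o 1Ki
        val=$(ceph osd pool get real-tier $o --format=json | jq -c .$o)
        [[ $val == 1024 ]]
        ceph osd pool set real-tier $o 1M
        val=$(ceph osd pool get real-tier $o --format=json | jq -c .$o)
        [[ $val == 1048576 ]]
    done
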
2026-03-31T20:36:33.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2500: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier csum_min_block 1Ki
2026-03-31T20:36:35.639 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 csum_min_block to 1Ki
2026-03-31T20:36:35.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier csum_min_block --format=json
2026-03-31T20:36:35.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: jq -c .csum_min_block
2026-03-31T20:36:35.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2501: test_mon_osd_tiered_pool_set: val=1024
2026-03-31T20:36:35.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2502: test_mon_osd_tiered_pool_set: [[ 1024 == 1024 ]]
2026-03-31T20:36:35.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2503: test_mon_osd_tiered_pool_set: ceph osd pool set real-tier csum_min_block 1M
2026-03-31T20:36:37.657 INFO:tasks.workunit.client.0.vm03.stderr:set pool 37 csum_min_block to 1M
2026-03-31T20:36:37.674 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: ceph osd pool get real-tier csum_min_block --format=json
2026-03-31T20:36:37.674 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: jq -c .csum_min_block
2026-03-31T20:36:37.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2504: test_mon_osd_tiered_pool_set: val=1048576
2026-03-31T20:36:37.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2505: test_mon_osd_tiered_pool_set: [[ 1048576 == 1048576 ]]
2026-03-31T20:36:37.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2509: test_mon_osd_tiered_pool_set: ceph osd pool create fake-tier 2
2026-03-31T20:36:38.721 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fake-tier' already exists
2026-03-31T20:36:38.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2510: test_mon_osd_tiered_pool_set: ceph osd pool application enable fake-tier rados
2026-03-31T20:36:40.674 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'fake-tier'
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2511: test_mon_osd_tiered_pool_set: wait_for_clean
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1652: wait_for_clean: local cmd=
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1653: wait_for_clean: local num_active_clean=-1
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1654: wait_for_clean: local cur_active_clean
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: get_timeout_delays 90 .1
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: shopt -q -o xtrace
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: echo true
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1598: get_timeout_delays: local trace=true
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: true
2026-03-31T20:36:40.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1599: get_timeout_delays: shopt -u -o xtrace
2026-03-31T20:36:40.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-31T20:36:40.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1655: wait_for_clean: local -a delays
2026-03-31T20:36:40.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1656: wait_for_clean: local -i loop=0
2026-03-31T20:36:40.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1658: wait_for_clean: flush_pg_stats
2026-03-31T20:36:40.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: local timeout=300
2026-03-31T20:36:40.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls
2026-03-31T20:36:40.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0
2026-03-31T20:36:40.966 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-31T20:36:40.966 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-31T20:36:40.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs=
2026-03-31T20:36:40.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids
2026-03-31T20:36:40.966 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats
2026-03-31T20:36:41.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=949187772618
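
The delays array printed above is the retry schedule get_timeout_delays built for wait_for_clean from a 90 s budget and a 0.1 s first step: each delay doubles, is capped (15 s here), and the final entry is trimmed so the series sums exactly to the timeout. A sketch that reproduces the traced schedule (an illustration of the arithmetic, not the ceph-helpers.sh implementation itself):

    awk 'BEGIN {
        timeout = 90; step = 0.1; cap = 15; total = 0
        while (total < timeout) {
            d = step
            if (d > cap) d = cap                    # clamp the doubling
            if (d > timeout - total) d = timeout - total  # trim the last delay
            printf "%g ", d; total += d; step *= 2
        }
        print ""
    }'
    # -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
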
2026-03-31T20:36:41.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 949187772618
2026-03-31T20:36:41.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772618'
2026-03-31T20:36:41.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids
2026-03-31T20:36:41.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats
2026-03-31T20:36:41.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738570
2026-03-31T20:36:41.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738570
2026-03-31T20:36:41.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772618 1-34359738570'
2026-03-31T20:36:41.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids
2026-03-31T20:36:41.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats
2026-03-31T20:36:41.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738570
2026-03-31T20:36:41.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738570
2026-03-31T20:36:41.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772618 1-34359738570 2-34359738570'
2026-03-31T20:36:41.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs
2026-03-31T20:36:41.209 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-949187772618
2026-03-31T20:36:41.209 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1
2026-03-31T20:36:41.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0
2026-03-31T20:36:41.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-949187772618
2026-03-31T20:36:41.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2
2026-03-31T20:36:41.212 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 949187772618
2026-03-31T20:36:41.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=949187772618
2026-03-31T20:36:41.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 949187772618'
2026-03-31T20:36:41.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-31T20:36:41.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772616 -lt 949187772618
2026-03-31T20:36:41.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1
2026-03-31T20:36:42.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-31T20:36:42.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-31T20:36:42.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772616 -lt 949187772618
2026-03-31T20:36:42.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1
2026-03-31T20:36:43.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-31T20:36:43.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-31T20:36:43.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772618 -lt 949187772618
2026-03-31T20:36:43.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs
2026-03-31T20:36:43.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738570
2026-03-31T20:36:43.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1
2026-03-31T20:36:43.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1
2026-03-31T20:36:43.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738570
2026-03-31T20:36:43.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2
2026-03-31T20:36:43.840 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738570
2026-03-31T20:36:43.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738570
2026-03-31T20:36:43.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738570'
2026-03-31T20:36:43.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-31T20:36:44.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738570 -lt 34359738570
2026-03-31T20:36:44.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs
2026-03-31T20:36:44.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738570
2026-03-31T20:36:44.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1
2026-03-31T20:36:44.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2
2026-03-31T20:36:44.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738570
2026-03-31T20:36:44.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2
2026-03-31T20:36:44.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738570
2026-03-31T20:36:44.053 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738570
2026-03-31T20:36:44.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738570'
2026-03-31T20:36:44.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-31T20:36:44.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738570 -lt 34359738570
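
Condensing the flush_pg_stats trace above: each OSD is told to flush its PG stats and returns a sequence number, and the helper then polls the monitor until ceph osd last-stat-seq catches up (osd.0 needed two 1 s sleeps before 949187772618 was reported). The core of the traced loop, stripped of the seqs bookkeeping:

    for osd in $(ceph osd ls); do
        # the tell returns the flush sequence number for this OSD
        seq=$(ceph tell osd.$osd flush_pg_stats)
        echo "waiting osd.$osd seq $seq"
        # poll until the mon has seen stats at least that fresh
        while test "$(ceph osd last-stat-seq $osd)" -lt "$seq"; do
            sleep 1
        done
    done
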
2026-03-31T20:36:44.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: get_num_pgs
2026-03-31T20:36:44.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status
2026-03-31T20:36:44.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs
2026-03-31T20:36:44.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1659: wait_for_clean: test 13 == 0
2026-03-31T20:36:44.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1663: wait_for_clean: true
2026-03-31T20:36:44.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: get_num_active_clean
2026-03-31T20:36:44.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1360: get_num_active_clean: local expression
2026-03-31T20:36:44.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1361: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-31T20:36:44.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1362: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-31T20:36:44.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1363: get_num_active_clean: ceph --format json pg dump pgs
2026-03-31T20:36:44.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1364: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-31T20:36:44.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1667: wait_for_clean: cur_active_clean=13
2026-03-31T20:36:44.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: get_num_pgs
2026-03-31T20:36:44.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: ceph --format json status
2026-03-31T20:36:44.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1421: get_num_pgs: jq .pgmap.num_pgs
2026-03-31T20:36:45.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: test 13 = 13
2026-03-31T20:36:45.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1668: wait_for_clean: break
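
wait_for_clean exits its retry loop here because the number of active+clean, non-stale PGs equals the total PG count (13 of 13). The jq pipeline traced from get_num_active_clean, pulled out as a standalone one-liner:

    ceph --format json pg dump pgs |
        jq '.pg_stats | [.[] | .state
             | select(contains("active") and contains("clean"))
             | select(contains("stale") | not)] | length'
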
2026-03-31T20:36:45.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:1683: wait_for_clean: return 0
2026-03-31T20:36:45.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2513: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier hit_set_type explicit_hash
2026-03-31T20:36:45.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:45.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier hit_set_type explicit_hash
2026-03-31T20:36:45.150 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:45.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:45.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2514: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier hit_set_type
2026-03-31T20:36:45.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:45.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier hit_set_type
2026-03-31T20:36:45.303 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:45.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:45.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2515: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier hit_set_type explicit_object
2026-03-31T20:36:45.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:45.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier hit_set_type explicit_object
2026-03-31T20:36:45.451 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:45.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:45.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2516: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier hit_set_type
2026-03-31T20:36:45.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:45.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier hit_set_type
2026-03-31T20:36:45.601 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:45.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
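
From here to the end of the section every tier-only option is retried against fake-tier, which is a plain pool: each set fails with EACCES "(13) Permission denied" and each get with "not a tier pool: variable not applicable". A hedged sketch of guarding against this up front by checking the pool's tier_of in the OSDMap (the field name is taken from ceph osd dump JSON output; the assumption that it is -1 for a non-tier pool should be verified against the release in use):

    tier_of=$(ceph osd dump -f json |
              jq '.pools[] | select(.pool_name == "fake-tier") | .tier_of')
    if [ "$tier_of" -ge 0 ]; then
        ceph osd pool set fake-tier hit_set_type bloom
    else
        echo "fake-tier is not a tier; hit_set_* and cache_* options do not apply"
    fi
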
2026-03-31T20:36:45.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2517: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier hit_set_type bloom
2026-03-31T20:36:45.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:45.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier hit_set_type bloom
2026-03-31T20:36:45.762 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:45.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:45.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2518: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier hit_set_type
2026-03-31T20:36:45.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:45.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier hit_set_type
2026-03-31T20:36:45.915 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:45.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:45.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2519: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier hit_set_type i_dont_exist
2026-03-31T20:36:45.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:45.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier hit_set_type i_dont_exist
2026-03-31T20:36:46.069 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:46.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:46.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2520: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier hit_set_period 123
2026-03-31T20:36:46.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:46.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier hit_set_period 123
2026-03-31T20:36:46.229 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:46.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:46.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2521: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier hit_set_period
2026-03-31T20:36:46.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:46.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier hit_set_period
2026-03-31T20:36:46.384 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:46.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:46.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2522: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier hit_set_count 12
2026-03-31T20:36:46.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:46.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier hit_set_count 12
2026-03-31T20:36:46.534 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:46.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:46.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2523: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier hit_set_count
2026-03-31T20:36:46.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:46.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier hit_set_count
2026-03-31T20:36:46.683 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:46.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:46.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2524: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier hit_set_fpp .01
2026-03-31T20:36:46.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:46.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier hit_set_fpp .01
2026-03-31T20:36:46.835 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:46.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:46.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2525: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier hit_set_fpp
2026-03-31T20:36:46.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:46.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier hit_set_fpp
2026-03-31T20:36:46.994 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:46.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:46.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2527: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier target_max_objects 123
2026-03-31T20:36:46.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:46.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier target_max_objects 123
2026-03-31T20:36:47.148 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:47.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:47.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2528: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier target_max_objects
2026-03-31T20:36:47.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:47.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier target_max_objects
2026-03-31T20:36:47.301 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:47.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:47.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2529: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier target_max_bytes 123456
2026-03-31T20:36:47.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:47.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier target_max_bytes 123456
2026-03-31T20:36:47.454 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:47.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:47.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2530: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier target_max_bytes
2026-03-31T20:36:47.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:47.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier target_max_bytes
2026-03-31T20:36:47.601 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:47.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:47.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2531: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_dirty_ratio .123
2026-03-31T20:36:47.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:47.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_dirty_ratio .123
2026-03-31T20:36:47.751 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:47.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:47.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2532: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier cache_target_dirty_ratio
2026-03-31T20:36:47.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:47.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier cache_target_dirty_ratio
2026-03-31T20:36:47.904 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:47.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:47.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2533: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_dirty_ratio -.2
2026-03-31T20:36:47.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:47.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_dirty_ratio -.2
2026-03-31T20:36:48.057 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:48.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:48.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2534: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_dirty_ratio 1.1
2026-03-31T20:36:48.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:48.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_dirty_ratio 1.1
2026-03-31T20:36:48.205 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:48.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:48.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2535: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_dirty_high_ratio .123
2026-03-31T20:36:48.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:48.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_dirty_high_ratio .123
2026-03-31T20:36:48.354 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:48.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:48.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2536: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier cache_target_dirty_high_ratio
2026-03-31T20:36:48.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:48.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier cache_target_dirty_high_ratio
2026-03-31T20:36:48.502 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable
2026-03-31T20:36:48.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:48.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2537: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_dirty_high_ratio -.2
2026-03-31T20:36:48.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:48.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_dirty_high_ratio -.2
2026-03-31T20:36:48.649 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied
2026-03-31T20:36:48.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:48.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2538: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_dirty_high_ratio 1.1
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:48.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_dirty_high_ratio 1.1 2026-03-31T20:36:48.799 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied 2026-03-31T20:36:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2539: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_full_ratio .123 2026-03-31T20:36:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_full_ratio .123 2026-03-31T20:36:48.951 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied 2026-03-31T20:36:48.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:48.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2540: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier cache_target_full_ratio 2026-03-31T20:36:48.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:48.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier cache_target_full_ratio 2026-03-31T20:36:49.102 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable 2026-03-31T20:36:49.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:49.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2541: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_full_ratio 1.0 2026-03-31T20:36:49.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:49.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_full_ratio 1.0 2026-03-31T20:36:49.252 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied 2026-03-31T20:36:49.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:49.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2542: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_full_ratio 0 2026-03-31T20:36:49.257 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:49.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_full_ratio 0 2026-03-31T20:36:49.407 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied 2026-03-31T20:36:49.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:49.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2543: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_target_full_ratio 1.1 2026-03-31T20:36:49.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:49.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_target_full_ratio 1.1 2026-03-31T20:36:49.567 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied 2026-03-31T20:36:49.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:49.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2544: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_min_flush_age 123 2026-03-31T20:36:49.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:49.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_min_flush_age 123 2026-03-31T20:36:49.730 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied 2026-03-31T20:36:49.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:49.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2545: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier cache_min_flush_age 2026-03-31T20:36:49.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:49.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier cache_min_flush_age 2026-03-31T20:36:49.889 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable 2026-03-31T20:36:49.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:49.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2546: test_mon_osd_tiered_pool_set: expect_false ceph osd pool set fake-tier cache_min_evict_age 234 2026-03-31T20:36:49.893 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:49.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool set fake-tier cache_min_evict_age 234 2026-03-31T20:36:50.045 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: (13) Permission denied 2026-03-31T20:36:50.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:50.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2547: test_mon_osd_tiered_pool_set: expect_false ceph osd pool get fake-tier cache_min_evict_age 2026-03-31T20:36:50.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:36:50.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd pool get fake-tier cache_min_evict_age 2026-03-31T20:36:50.199 INFO:tasks.workunit.client.0.vm03.stderr:Error EACCES: pool 'fake-tier' is not a tier pool: variable not applicable 2026-03-31T20:36:50.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:36:50.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2549: test_mon_osd_tiered_pool_set: ceph osd tier remove rbd real-tier 2026-03-31T20:36:50.856 INFO:tasks.workunit.client.0.vm03.stderr:pool 'real-tier' is now (or already was) not a tier of 'rbd' 2026-03-31T20:36:50.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2550: test_mon_osd_tiered_pool_set: ceph osd pool delete real-tier real-tier --yes-i-really-really-mean-it 2026-03-31T20:36:51.858 INFO:tasks.workunit.client.0.vm03.stderr:pool 'real-tier' does not exist 2026-03-31T20:36:51.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2551: test_mon_osd_tiered_pool_set: ceph osd pool delete fake-tier fake-tier --yes-i-really-really-mean-it 2026-03-31T20:36:52.865 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fake-tier' does not exist 2026-03-31T20:36:52.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:36:53.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_osd_erasure_code 2026-03-31T20:36:53.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2557: test_mon_osd_erasure_code: ceph osd erasure-code-profile set fooprofile a=b c=d 2026-03-31T20:36:53.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2558: test_mon_osd_erasure_code: ceph osd erasure-code-profile set fooprofile a=b c=d 2026-03-31T20:36:54.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2559: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set fooprofile a=b c=d e=f 2026-03-31T20:36:54.106 
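
Note: every `:35:`/`:36:` pair above is the xtrace of the expect_false helper in qa/workunits/cephtool/test.sh, which inverts the exit status of the command it wraps; "expect_false: return 0" therefore marks a check that passed because the command failed as required. A minimal sketch of the helper, reconstructed from the trace rather than copied from the source:

    # sketch of test.sh's expect_false (body assumed from the xtrace above)
    function expect_false()
    {
        set -x
        if "$@"; then return 1; else return 0; fi
    }

The EACCES / "not a tier pool" errors are the expected outcomes here: fake-tier is an ordinary pool, so the tier-only variables may be neither set nor read on it.
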
2026-03-31T20:36:53.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_osd_erasure_code
2026-03-31T20:36:53.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2557: test_mon_osd_erasure_code: ceph osd erasure-code-profile set fooprofile a=b c=d
2026-03-31T20:36:53.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2558: test_mon_osd_erasure_code: ceph osd erasure-code-profile set fooprofile a=b c=d
2026-03-31T20:36:54.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2559: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set fooprofile a=b c=d e=f
2026-03-31T20:36:54.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:54.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile set fooprofile a=b c=d e=f
2026-03-31T20:36:54.262 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: will not override erasure code profile fooprofile because the existing profile {a=b,c=d,crush-device-class=,crush-failure-domain=osd,crush-num-failure-domains=0,crush-osds-per-failure-domain=0,crush-root=default,k=2,m=1,plugin=isa,technique=reed_sol_van} is different from the proposed profile {a=b,c=d,crush-device-class=,crush-failure-domain=osd,crush-num-failure-domains=0,crush-osds-per-failure-domain=0,crush-root=default,e=f,k=2,m=1,plugin=isa,technique=reed_sol_van}
2026-03-31T20:36:54.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:54.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2560: test_mon_osd_erasure_code: ceph osd erasure-code-profile set fooprofile a=b c=d e=f --force --yes-i-really-mean-it
2026-03-31T20:36:54.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2561: test_mon_osd_erasure_code: ceph osd erasure-code-profile set fooprofile a=b c=d e=f
2026-03-31T20:36:55.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2562: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set fooprofile a=b c=d e=f g=h
2026-03-31T20:36:55.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:55.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile set fooprofile a=b c=d e=f g=h
2026-03-31T20:36:55.250 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: will not override erasure code profile fooprofile because the existing profile {a=b,c=d,crush-device-class=,crush-failure-domain=osd,crush-num-failure-domains=0,crush-osds-per-failure-domain=0,crush-root=default,e=f,k=2,m=1,plugin=isa,technique=reed_sol_van} is different from the proposed profile {a=b,c=d,crush-device-class=,crush-failure-domain=osd,crush-num-failure-domains=0,crush-osds-per-failure-domain=0,crush-root=default,e=f,g=h,k=2,m=1,plugin=isa,technique=reed_sol_van}
2026-03-31T20:36:55.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:55.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2564: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set barprofile ruleset-failure-domain=host
2026-03-31T20:36:55.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:55.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile set barprofile ruleset-failure-domain=host
2026-03-31T20:36:55.404 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: property 'ruleset-failure-domain' is no longer supported; try 'crush-failure-domain' instead
2026-03-31T20:36:55.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:55.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2565: test_mon_osd_erasure_code: ceph osd erasure-code-profile set barprofile crush-failure-domain=host
2026-03-31T20:36:55.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2567: test_mon_osd_erasure_code: ceph osd erasure-code-profile rm fooprofile
2026-03-31T20:36:56.894 INFO:tasks.workunit.client.0.vm03.stderr:erasure-code-profile fooprofile does not exist
2026-03-31T20:36:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2568: test_mon_osd_erasure_code: ceph osd erasure-code-profile rm barprofile
2026-03-31T20:36:57.894 INFO:tasks.workunit.client.0.vm03.stderr:erasure-code-profile barprofile does not exist
2026-03-31T20:36:57.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2571: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set badk k=1 m=1
2026-03-31T20:36:57.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:57.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile set badk k=1 m=1
2026-03-31T20:36:58.056 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: k=1 must be >= 2
2026-03-31T20:36:58.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:58.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2572: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set badk k=1 m=2
2026-03-31T20:36:58.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:58.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile set badk k=1 m=2
2026-03-31T20:36:58.209 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: k=1 must be >= 2
2026-03-31T20:36:58.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:58.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2573: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set badk k=0 m=2
2026-03-31T20:36:58.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:58.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile set badk k=0 m=2
2026-03-31T20:36:58.363 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: k=0 must be >= 2
2026-03-31T20:36:58.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:58.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2574: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set badk k=-1 m=2
2026-03-31T20:36:58.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:58.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile set badk k=-1 m=2
2026-03-31T20:36:58.514 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: k=-1 must be >= 2
2026-03-31T20:36:58.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:58.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2575: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set badm k=2 m=0
2026-03-31T20:36:58.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:58.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile set badm k=2 m=0
2026-03-31T20:36:58.670 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: m=0 must be >= 1
2026-03-31T20:36:58.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:58.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2576: test_mon_osd_erasure_code: expect_false ceph osd erasure-code-profile set badm k=2 m=-1
2026-03-31T20:36:58.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:36:58.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd erasure-code-profile set badm k=2 m=-1
2026-03-31T20:36:58.833 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: m=-1 must be >= 1
2026-03-31T20:36:58.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:36:58.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2577: test_mon_osd_erasure_code: ceph osd erasure-code-profile set good k=2 m=1
2026-03-31T20:36:59.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2578: test_mon_osd_erasure_code: ceph osd erasure-code-profile rm good
2026-03-31T20:37:00.913 INFO:tasks.workunit.client.0.vm03.stderr:erasure-code-profile good does not exist
2026-03-31T20:37:00.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x
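
Note: the EPERM messages above document erasure-code-profile immutability: re-issuing set with identical key/value pairs is accepted as a no-op, but any change to an existing profile is refused unless explicitly forced. Condensed for reference, the sequence the test exercises (all commands verbatim from the trace):

    ceph osd erasure-code-profile set fooprofile a=b c=d        # create
    ceph osd erasure-code-profile set fooprofile a=b c=d        # identical args: idempotent no-op
    ceph osd erasure-code-profile set fooprofile a=b c=d e=f    # differs from stored profile: EPERM
    ceph osd erasure-code-profile set fooprofile a=b c=d e=f --force --yes-i-really-mean-it
    ceph osd erasure-code-profile rm fooprofile

The k/m checks that follow (k=1, k=0, k=-1 rejected with "must be >= 2"; m=0, m=-1 rejected with "must be >= 1") confirm the profile validator runs before anything is stored.
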
2026-03-31T20:37:01.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_osd_misc
2026-03-31T20:37:01.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2583: test_mon_osd_misc: set +e
2026-03-31T20:37:01.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2586: test_mon_osd_misc: ceph osd map
2026-03-31T20:37:01.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2586: test_mon_osd_misc: check_response pool 22 22
2026-03-31T20:37:01.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string=pool
2026-03-31T20:37:01.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22
2026-03-31T20:37:01.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22
2026-03-31T20:37:01.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']'
2026-03-31T20:37:01.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- pool /tmp/cephtool.sYl/test_invalid.NL9
2026-03-31T20:37:01.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2589: test_mon_osd_misc: ceph osd ls foo
2026-03-31T20:37:01.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2589: test_mon_osd_misc: check_response unused 22 22
2026-03-31T20:37:01.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string=unused
2026-03-31T20:37:01.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22
2026-03-31T20:37:01.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22
2026-03-31T20:37:01.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']'
2026-03-31T20:37:01.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- unused /tmp/cephtool.sYl/test_invalid.NL9
2026-03-31T20:37:01.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2592: test_mon_osd_misc: ceph osd reweight-by-utilization 80
2026-03-31T20:37:01.570 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:37:01.569+0000 7f026e475640 -1 mgr.server reply reply (22) Invalid argument You must give a percentage higher than 100. The reweighting threshold will be calculated as <average-utilization> times <input-percentage>. For example, an argument of 200 would reweight OSDs which are twice as utilized as the average OSD.
2026-03-31T20:37:01.570 INFO:tasks.ceph.mgr.x.vm03.stderr:FAILED reweight-by-pg
2026-03-31T20:37:01.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2592: test_mon_osd_misc: check_response 'higher than 100' 22 22
2026-03-31T20:37:01.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='higher than 100'
2026-03-31T20:37:01.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22
2026-03-31T20:37:01.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22
2026-03-31T20:37:01.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']'
2026-03-31T20:37:01.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'higher than 100' /tmp/cephtool.sYl/test_invalid.NL9
2026-03-31T20:37:01.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2594: test_mon_osd_misc: set -e
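
Note: the `:103:`-`:111:` lines are the xtrace of the check_response helper: it takes an expected substring, the actual return code, and an expected return code, then greps the command's captured output. A sketch consistent with that trace (the variable names appear in the trace; the capture file, here /tmp/cephtool.sYl/test_invalid.NL9, is written by the caller, and the $TMPFILE name below is an assumption):

    # sketch of check_response as implied by the xtrace above
    function check_response()
    {
        local expected_string=$1
        local retcode=$2
        local expected_retcode=$3
        # trace line 106: fail if a code was expected and the actual one differs
        if [ $expected_retcode -a $retcode != $expected_retcode ]; then
            echo "expected return code $expected_retcode, got $retcode" >&2
            return 1
        fi
        # trace line 111: the wrapped command's output was redirected to $TMPFILE by the caller
        grep --quiet -- "$expected_string" "$TMPFILE"
    }
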
2026-03-31T20:37:01.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2596: test_mon_osd_misc: ceph config get mgr mon_reweight_min_bytes_per_osd
2026-03-31T20:37:01.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2596: test_mon_osd_misc: local old_bytes_per_osd=104857600
2026-03-31T20:37:01.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2597: test_mon_osd_misc: ceph config get mgr mon_reweight_min_pgs_per_osd
2026-03-31T20:37:01.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2597: test_mon_osd_misc: local old_pgs_per_osd=10
2026-03-31T20:37:01.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2601: test_mon_osd_misc: ceph config set mgr mon_reweight_min_bytes_per_osd 0
2026-03-31T20:37:02.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2602: test_mon_osd_misc: ceph config set mgr mon_reweight_min_pgs_per_osd 0
2026-03-31T20:37:02.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2603: test_mon_osd_misc: ceph osd reweight-by-utilization 110
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:moved 0 / 18 (0%)
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:avg 6
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:stddev 2.44949 -> 2.44949 (expected baseline 2)
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:min osd.2 with 3 -> 3 pgs (0.5 -> 0.5 * mean)
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:max osd.1 with 9 -> 9 pgs (1.5 -> 1.5 * mean)
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:oload 110
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:max_change 0.05
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:max_change_osds 4
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:average_utilization 0.0003
2026-03-31T20:37:02.597 INFO:tasks.workunit.client.0.vm03.stdout:overload_utilization 0.0004
2026-03-31T20:37:02.598 INFO:tasks.workunit.client.0.vm03.stderr:no change
2026-03-31T20:37:02.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2604: test_mon_osd_misc: ceph osd reweight-by-utilization 110 .5
2026-03-31T20:37:02.799 INFO:tasks.workunit.client.0.vm03.stdout:moved 0 / 18 (0%)
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:avg 6
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:stddev 2.44949 -> 2.44949 (expected baseline 2)
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:min osd.2 with 3 -> 3 pgs (0.5 -> 0.5 * mean)
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:max osd.1 with 9 -> 9 pgs (1.5 -> 1.5 * mean)
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:oload 110
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:max_change 0.5
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:max_change_osds 4
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:average_utilization 0.0003
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stdout:overload_utilization 0.0004
2026-03-31T20:37:02.800 INFO:tasks.workunit.client.0.vm03.stderr:no change
2026-03-31T20:37:02.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2605: test_mon_osd_misc: expect_false ceph osd reweight-by-utilization 110 0
2026-03-31T20:37:02.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:37:02.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd reweight-by-utilization 110 0
2026-03-31T20:37:02.961 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:37:02.961+0000 7f026e475640 -1 mgr.server reply reply (22) Invalid argument max_change 0 must be positive
2026-03-31T20:37:02.961 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: max_change 0 must be positive
2026-03-31T20:37:02.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:37:02.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2606: test_mon_osd_misc: expect_false ceph osd reweight-by-utilization 110 -0.1
2026-03-31T20:37:02.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:37:02.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd reweight-by-utilization 110 -0.1
2026-03-31T20:37:03.110 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:37:03.109+0000 7f026e475640 -1 mgr.server reply reply (22) Invalid argument max_change -0.1 must be positive
2026-03-31T20:37:03.110 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: max_change -0.1 must be positive
2026-03-31T20:37:03.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:37:03.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2607: test_mon_osd_misc: ceph osd test-reweight-by-utilization 110 .5 --no-increasing
2026-03-31T20:37:03.306 INFO:tasks.workunit.client.0.vm03.stdout:moved 0 / 18 (0%)
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:avg 6
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:stddev 2.44949 -> 2.44949 (expected baseline 2)
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:min osd.2 with 3 -> 3 pgs (0.5 -> 0.5 * mean)
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:max osd.1 with 9 -> 9 pgs (1.5 -> 1.5 * mean)
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:oload 110
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:max_change 0.5
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:max_change_osds 4
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:average_utilization 0.0003
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stdout:overload_utilization 0.0004
2026-03-31T20:37:03.307 INFO:tasks.workunit.client.0.vm03.stderr:no change
2026-03-31T20:37:03.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2608: test_mon_osd_misc: ceph osd test-reweight-by-utilization 110 .5 4 --no-increasing
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:moved 0 / 18 (0%)
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:avg 6
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:stddev 2.44949 -> 2.44949 (expected baseline 2)
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:min osd.2 with 3 -> 3 pgs (0.5 -> 0.5 * mean)
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:max osd.1 with 9 -> 9 pgs (1.5 -> 1.5 * mean)
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:oload 110
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:max_change 0.5
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:max_change_osds 4
2026-03-31T20:37:03.509 INFO:tasks.workunit.client.0.vm03.stdout:average_utilization 0.0003
2026-03-31T20:37:03.510 INFO:tasks.workunit.client.0.vm03.stdout:overload_utilization 0.0004
2026-03-31T20:37:03.510 INFO:tasks.workunit.client.0.vm03.stderr:no change
2026-03-31T20:37:03.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2609: test_mon_osd_misc: expect_false ceph osd test-reweight-by-utilization 110 .5 0 --no-increasing
2026-03-31T20:37:03.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:37:03.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd test-reweight-by-utilization 110 .5 0 --no-increasing
2026-03-31T20:37:03.664 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:37:03.665+0000 7f026e475640 -1 mgr.server reply reply (22) Invalid argument max_osds 0 must be positive
2026-03-31T20:37:03.664 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: max_osds 0 must be positive
2026-03-31T20:37:03.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:37:03.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2610: test_mon_osd_misc: expect_false ceph osd test-reweight-by-utilization 110 .5 -10 --no-increasing
2026-03-31T20:37:03.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:37:03.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd test-reweight-by-utilization 110 .5 -10 --no-increasing
2026-03-31T20:37:03.819 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:37:03.817+0000 7f026e475640 -1 mgr.server reply reply (22) Invalid argument max_osds -10 must be positive
2026-03-31T20:37:03.819 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: max_osds -10 must be positive
2026-03-31T20:37:03.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:37:03.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2611: test_mon_osd_misc: ceph osd reweight-by-pg 110
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:moved 0 / 18 (0%)
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:avg 6
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:stddev 2.44949 -> 2.44949 (expected baseline 2)
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:min osd.2 with 3 -> 3 pgs (0.5 -> 0.5 * mean)
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:max osd.1 with 9 -> 9 pgs (1.5 -> 1.5 * mean)
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:oload 110
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:max_change 0.05
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:max_change_osds 4
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:average_utilization 68.2667
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:overload_utilization 75.0933
2026-03-31T20:37:05.890 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 weight 0.9500 -> 0.9000
2026-03-31T20:37:05.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2612: test_mon_osd_misc: ceph osd test-reweight-by-pg 110 .5
2026-03-31T20:37:06.063 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:37:06.061+0000 7f026e475640 -1 mgr.server reply reply (1) Operation not permitted no change
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:moved 4 / 18 (22.2222%)
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:avg 6
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:stddev 2.44949 -> 1.41421 (expected baseline 2)
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:min osd.2 with 3 -> 5 pgs (0.5 -> 0.833333 * mean)
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:max osd.1 with 9 -> 5 pgs (1.5 -> 0.833333 * mean)
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:oload 110
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:max_change 0.5
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:max_change_osds 4
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:average_utilization 68.2667
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:overload_utilization 75.0933
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 weight 0.9000 -> 0.6000
2026-03-31T20:37:06.063 INFO:tasks.workunit.client.0.vm03.stderr:no change
2026-03-31T20:37:06.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2613: test_mon_osd_misc: ceph osd reweight-by-pg 110 rbd
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:moved 0 / 16 (0%)
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:avg 5.33333
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:stddev 1.24722 -> 1.24722 (expected baseline 1.88562)
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:min osd.2 with 4 -> 4 pgs (0.75 -> 0.75 * mean)
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:max osd.1 with 7 -> 7 pgs (1.3125 -> 1.3125 * mean)
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:oload 110
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:max_change 0.05
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:max_change_osds 4
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:average_utilization 60.6815
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:overload_utilization 66.7496
2026-03-31T20:37:07.896 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 weight 0.8500 -> 0.8000
2026-03-31T20:37:07.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2614: test_mon_osd_misc: ceph osd reweight-by-pg 110 .5 rbd
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:moved 1 / 16 (6.25%)
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:avg 5.33333
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:stddev 1.69967 -> 2.35702 (expected baseline 1.88562)
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:min osd.1 with 3 -> 2 pgs (0.5625 -> 0.375 * mean)
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:max osd.0 with 7 -> 7 pgs (1.3125 -> 1.3125 * mean)
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:oload 110
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:max_change 0.5
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:max_change_osds 4
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:average_utilization 60.6815
2026-03-31T20:37:09.916 INFO:tasks.workunit.client.0.vm03.stdout:overload_utilization 66.7496
2026-03-31T20:37:09.917 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 weight 0.5334 -> 0.3556
2026-03-31T20:37:09.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2615: test_mon_osd_misc: expect_false ceph osd reweight-by-pg 110 boguspoolasdfasdfasdf
2026-03-31T20:37:09.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:37:09.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph osd reweight-by-pg 110 boguspoolasdfasdfasdf
2026-03-31T20:37:10.093 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:37:10.093+0000 7f026e475640 -1 mgr.server reply reply (2) No such file or directory pool 'boguspoolasdfasdfasdf' does not exist
2026-03-31T20:37:10.094 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: pool 'boguspoolasdfasdfasdf' does not exist
2026-03-31T20:37:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:37:10.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2617: test_mon_osd_misc: ceph config set mgr mon_reweight_min_bytes_per_osd 104857600
2026-03-31T20:37:10.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2618: test_mon_osd_misc: ceph config set mgr mon_reweight_min_pgs_per_osd 10
2026-03-31T20:37:10.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x
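
Note: the reweight exchanges above pin down the mgr's argument validation: the overload threshold must exceed 100 (it is a percentage of average utilization), max_change and max_osds must be positive, the test-* variants are dry runs, --no-increasing only lowers weights, and reweight-by-pg accepts an optional pool name (a bogus name fails with ENOENT). A usage recap built only from commands that appear in the trace:

    ceph osd reweight-by-utilization 110                              # oload: must be > 100
    ceph osd reweight-by-utilization 110 .5                           # optional max_change, > 0
    ceph osd test-reweight-by-utilization 110 .5 4 --no-increasing    # dry run; max_osds > 0
    ceph osd reweight-by-pg 110 .5 rbd                                # optional trailing pool name
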
INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mon.0", 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "", 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:10.838 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "type": "none", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 
2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mon.1", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:10.839 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.840 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "name": "osd.0", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "osd.0", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6804", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 950776786 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6804", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 950776786 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "osd", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile osd" 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 4164, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 710, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.?", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "mgr.x", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 2742811205 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: 
"socket_addr": { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 2742811205 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile mgr" 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 6663, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 704, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.?", 2026-03-31T20:37:10.840 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "mgr.x", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1428989468 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1428989468 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "text": 
"allow profile mgr" 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 6702, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 705, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.11609", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "client.admin", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 3301593755 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 3301593755 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 14421, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "new_ok", 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:10.841 INFO:tasks.workunit.client.0.vm03.stdout:] 2026-03-31T20:37:10.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2693: test_mon_tell: ceph_watch_start debug audit 2026-03-31T20:37:10.848 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:155: ceph_watch_start: local whatch_opt=--watch
2026-03-31T20:37:10.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:157: ceph_watch_start: '[' -n debug ']'
2026-03-31T20:37:10.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:158: ceph_watch_start: whatch_opt=--watch-debug
2026-03-31T20:37:10.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:159: ceph_watch_start: '[' -n audit ']'
2026-03-31T20:37:10.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:160: ceph_watch_start: whatch_opt+=' --watch-channel audit'
2026-03-31T20:37:10.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:164: ceph_watch_start: CEPH_WATCH_FILE=/tmp/cephtool.sYl/CEPH_WATCH_26274
2026-03-31T20:37:10.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:166: ceph_watch_start: CEPH_WATCH_PID=63915
2026-03-31T20:37:10.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:165: ceph_watch_start: ceph --watch-debug --watch-channel audit
2026-03-31T20:37:10.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: seq 3
2026-03-31T20:37:10.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: for i in `seq 3`
2026-03-31T20:37:10.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:171: ceph_watch_start: grep -q cluster /tmp/cephtool.sYl/CEPH_WATCH_26274
2026-03-31T20:37:10.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:172: ceph_watch_start: sleep 1
2026-03-31T20:37:11.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: for i in `seq 3`
2026-03-31T20:37:11.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:171: ceph_watch_start: grep -q cluster /tmp/cephtool.sYl/CEPH_WATCH_26274
2026-03-31T20:37:11.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:171: ceph_watch_start: break
2026-03-31T20:37:11.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2694: test_mon_tell: ceph tell mon.a sessions
2026-03-31T20:37:11.919 INFO:tasks.workunit.client.0.vm03.stdout:[
2026-03-31T20:37:11.919 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-31T20:37:11.919 INFO:tasks.workunit.client.0.vm03.stdout:        "name": "mon.2",
2026-03-31T20:37:11.919 INFO:tasks.workunit.client.0.vm03.stdout:        "entity_name": "",
2026-03-31T20:37:11.919 INFO:tasks.workunit.client.0.vm03.stdout:        "addrs": {
2026-03-31T20:37:11.919 INFO:tasks.workunit.client.0.vm03.stdout:            "addrvec": [
2026-03-31T20:37:11.919 INFO:tasks.workunit.client.0.vm03.stdout:                {
2026-03-31T20:37:11.919 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "v2",
2026-03-31T20:37:11.919 INFO:tasks.workunit.client.0.vm03.stdout:                    "addr":
"192.168.123.103:3302", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mon.0", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: ] 
2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "type": "none", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:11.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mon.1", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:11.921 
INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "name": "osd.0", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "osd.0", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6804", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 950776786 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6804", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 950776786 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "osd", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile osd" 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 4164, 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:11.921 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 711, 2026-03-31T20:37:11.921 
INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.?", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "mgr.x", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 2742811205 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 2742811205 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile mgr" 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 6663, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 704, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.?", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "mgr.x", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1428989468 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:11.922 
INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1428989468 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile mgr" 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 6702, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 705, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.922 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.?", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "client.admin", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 273774567 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 273774567 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:11.923 
INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 14436, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "new_ok", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.14436", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "client.admin", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 273774567 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 273774567 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 14445, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "new_ok", 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:11.923 INFO:tasks.workunit.client.0.vm03.stdout:] 2026-03-31T20:37:11.928 
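With the second sessions dump closed above, the test returns to its watch helpers. The ceph_watch_start trace earlier backgrounds ceph --watch-debug --watch-channel audit with output redirected to a capture file, then polls that file with grep until the watcher has produced something; the ceph_watch_wait trace that follows greps the capture for the expected audit line for up to 30 one-second attempts, kills the watcher, and re-greps without -q to print the match. A rough Python analogue of that start/poll/kill pattern (the file path and timeout are illustrative; the real helpers use a mktemp directory and the PID recorded in CEPH_WATCH_PID):

    # Background an audit-channel watcher into a file, wait for a regex to
    # appear, then terminate it, mirroring ceph_watch_start/ceph_watch_wait.
    import re
    import subprocess
    import time

    def watch_start(path="/tmp/ceph_watch.log"):
        out = open(path, "w")
        proc = subprocess.Popen(
            ["ceph", "--watch-debug", "--watch-channel", "audit"],
            stdout=out, stderr=subprocess.STDOUT)
        return proc, path

    def watch_wait(proc, path, regexp, timeout=30):
        pat = re.compile(regexp)
        try:
            for _ in range(timeout):
                with open(path) as f:
                    if any(pat.search(line) for line in f):
                        return True
                time.sleep(1)
            return False
        finally:
            proc.terminate()

    proc, path = watch_start()
    found = watch_wait(proc, path,
                       r"mon\.a \[DBG\] from.*cmd='sessions' args=\[\]: dispatch")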
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2695: test_mon_tell: ceph_watch_wait 'mon.a \[DBG\] from.*cmd='\''sessions'\'' args=\[\]: dispatch'
2026-03-31T20:37:11.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:178: ceph_watch_wait: local 'regexp=mon.a \[DBG\] from.*cmd='\''sessions'\'' args=\[\]: dispatch'
2026-03-31T20:37:11.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:179: ceph_watch_wait: local timeout=30
2026-03-31T20:37:11.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:181: ceph_watch_wait: '[' -n '' ']'
2026-03-31T20:37:11.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:185: ceph_watch_wait: seq 30
2026-03-31T20:37:11.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:185: ceph_watch_wait: for i in `seq ${timeout}`
2026-03-31T20:37:11.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:186: ceph_watch_wait: grep -q 'mon.a \[DBG\] from.*cmd='\''sessions'\'' args=\[\]: dispatch' /tmp/cephtool.sYl/CEPH_WATCH_26274
2026-03-31T20:37:11.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:186: ceph_watch_wait: break
2026-03-31T20:37:11.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:190: ceph_watch_wait: kill 63915
2026-03-31T20:37:11.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:192: ceph_watch_wait: grep 'mon.a \[DBG\] from.*cmd='\''sessions'\'' args=\[\]: dispatch' /tmp/cephtool.sYl/CEPH_WATCH_26274
2026-03-31T20:37:11.933 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:37:10.836030+0000 mon.a [DBG] from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
2026-03-31T20:37:11.933 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:37:10.839261+0000 mon.a [DBG] from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
2026-03-31T20:37:11.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2691: test_mon_tell: for m in mon.a mon.b
2026-03-31T20:37:11.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2692: test_mon_tell: ceph tell mon.b sessions
2026-03-31T20:37:12.003 INFO:tasks.workunit.client.0.vm03.stdout:[
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:        "name": "mon.2",
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:        "entity_name": "",
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:        "addrs": {
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:            "addrvec": [
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:                {
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:                    "type": "v2",
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:                    "addr": "192.168.123.103:3302",
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:                    "nonce": 0
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout:                },
2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mon.1", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:12.004 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 
2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "type": "none", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mon.0", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 
2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "name": "osd.2", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "osd.2", 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:12.005 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6808", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1523795840 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6808", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1523795840 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "osd", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile osd" 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 4144, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 710, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: { 
2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mgr.5953", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "mgr.x", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 832516962 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 832516962 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mgr", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile mgr" 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 5953, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 711, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.11621", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "client.admin", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:12.006 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 772360210 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 
2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 772360210 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 11605, 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "new_ok", 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:12.007 INFO:tasks.workunit.client.0.vm03.stdout:] 2026-03-31T20:37:12.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2693: test_mon_tell: ceph_watch_start debug audit 2026-03-31T20:37:12.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:155: ceph_watch_start: local whatch_opt=--watch 2026-03-31T20:37:12.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:157: ceph_watch_start: '[' -n debug ']' 2026-03-31T20:37:12.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:158: ceph_watch_start: whatch_opt=--watch-debug 2026-03-31T20:37:12.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:159: ceph_watch_start: '[' -n audit ']' 2026-03-31T20:37:12.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:160: ceph_watch_start: whatch_opt+=' --watch-channel audit' 2026-03-31T20:37:12.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:164: ceph_watch_start: CEPH_WATCH_FILE=/tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:37:12.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:166: ceph_watch_start: CEPH_WATCH_PID=63992 2026-03-31T20:37:12.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:165: ceph_watch_start: ceph --watch-debug --watch-channel audit 2026-03-31T20:37:12.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: seq 3 2026-03-31T20:37:12.014 
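mon.b's dump above is its own session table rather than a copy of mon.a's: the quorum peers reappear under their rank names (mon.0/1/2), but the daemon sessions differ; here mon.b holds osd.2 and the mgr's named session (mgr.5953, con_type "mgr"), where mon.a saw only the mgr's extra client connections. Across all of these dumps, global_id_status "reclaim_ok" marks daemons that re-authenticated an existing global_id after reconnecting, while "new_ok" marks freshly issued IDs such as the short-lived CLI clients this test keeps creating. A small sketch that tallies both monitors' sessions along those lines (the mon IDs match the test's "for m in mon.a mon.b" loop):

    # Tally each monitor's sessions by connection type and global_id status.
    import json
    import subprocess
    from collections import Counter

    for mon_id in ("a", "b"):
        sessions = json.loads(subprocess.check_output(
            ["ceph", "tell", f"mon.{mon_id}", "sessions"]))
        counts = Counter((s["con_type"], s["global_id_status"]) for s in sessions)
        print(f"mon.{mon_id}:")
        for (con_type, status), n in sorted(counts.items()):
            print(f"  {con_type:<8} {status:<12} {n}")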
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: for i in `seq 3` 2026-03-31T20:37:12.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:171: ceph_watch_start: grep -q cluster /tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:37:12.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:172: ceph_watch_start: sleep 1 2026-03-31T20:37:13.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:170: ceph_watch_start: for i in `seq 3` 2026-03-31T20:37:13.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:171: ceph_watch_start: grep -q cluster /tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:37:13.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:171: ceph_watch_start: break 2026-03-31T20:37:13.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2694: test_mon_tell: ceph tell mon.a sessions 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout:[ 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mon.2", 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "", 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791", 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:37:13.085 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:13.086 
INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mon.0", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "type": "none", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: 
"remote_host": "" 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "name": "mon.1", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.086 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "mon", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 0, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "none", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "name": "osd.0", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "osd.0", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 
2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6804", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 950776786 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6804", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 950776786 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "osd", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile osd" 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 4164, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 711, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.?", 2026-03-31T20:37:13.087 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "mgr.x", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 2742811205 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 2742811205 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 
4541880224203014143, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile mgr" 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 6663, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 704, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "vm03" 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.?", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "mgr.x", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1428989468 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1428989468 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow profile mgr" 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 6702, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "reclaim_ok", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 705, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: 
"remote_host": "vm03" 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "name": "client.11633", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "entity_name": "client.admin", 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addrs": { 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.088 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1460428334 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "socket_addr": { 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "type": "any", 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:0", 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 1460428334 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "con_type": "client", 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "con_features": 4541880224203014143, 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_hex": "3f07fffffffdffff", 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "con_features_release": "squid", 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "open": true, 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "caps": { 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "text": "allow *" 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "authenticated": true, 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "global_id": 14475, 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "global_id_status": "new_ok", 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "osd_epoch": 0, 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: "remote_host": "" 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.089 INFO:tasks.workunit.client.0.vm03.stdout:] 2026-03-31T20:37:13.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2695: test_mon_tell: ceph_watch_wait 'mon.b \[DBG\] from.*cmd='\''sessions'\'' args=\[\]: dispatch' 2026-03-31T20:37:13.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:178: ceph_watch_wait: local 'regexp=mon.b \[DBG\] from.*cmd='\''sessions'\'' args=\[\]: dispatch' 2026-03-31T20:37:13.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:179: ceph_watch_wait: local timeout=30 2026-03-31T20:37:13.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:181: ceph_watch_wait: '[' -n '' ']' 2026-03-31T20:37:13.093 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:185: ceph_watch_wait: seq 30 2026-03-31T20:37:13.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:185: ceph_watch_wait: for i in `seq ${timeout}` 2026-03-31T20:37:13.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:186: ceph_watch_wait: grep -q 'mon.b \[DBG\] from.*cmd='\''sessions'\'' args=\[\]: dispatch' /tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:37:13.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:186: ceph_watch_wait: break 2026-03-31T20:37:13.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:190: ceph_watch_wait: kill 63992 2026-03-31T20:37:13.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:192: ceph_watch_wait: grep 'mon.b \[DBG\] from.*cmd='\''sessions'\'' args=\[\]: dispatch' /tmp/cephtool.sYl/CEPH_WATCH_26274 2026-03-31T20:37:13.096 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:37:12.001750+0000 mon.b [DBG] from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch 2026-03-31T20:37:13.096 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:37:12.005337+0000 mon.b [DBG] from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch 2026-03-31T20:37:13.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2697: test_mon_tell: expect_false ceph tell mon.foo version 2026-03-31T20:37:13.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:13.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell mon.foo version 2026-03-31T20:37:13.155 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: problem getting command descriptions from mon.foo 2026-03-31T20:37:13.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:13.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2699: test_mon_tell: test_tell_output_file mon.0 2026-03-31T20:37:13.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:879: test_tell_output_file: name=mon.0 2026-03-31T20:37:13.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:880: test_tell_output_file: shift 2026-03-31T20:37:13.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:884: test_tell_output_file: ceph tell --format=json --daemon-output-file=/tmp/foo mon.0 version 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:884: test_tell_output_file: J=' 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr:{ 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr: "path": "/tmp/foo", 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr: "result": 0, 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr: 
"output": "", 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr: "len": 79 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:885: test_tell_output_file: expect_true jq -e '.path == "/tmp/foo"' 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq -e '.path == "/tmp/foo"' 2026-03-31T20:37:13.244 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:13.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:886: test_tell_output_file: expect_true test -e /tmp/foo 2026-03-31T20:37:13.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: test -e /tmp/foo 2026-03-31T20:37:13.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:888: test_tell_output_file: expect_true sed 2q1 2026-03-31T20:37:13.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: sed 2q1 2026-03-31T20:37:13.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:889: test_tell_output_file: expect_true jq -e '.version | length > 0' 2026-03-31T20:37:13.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq -e '.version | length > 0' 2026-03-31T20:37:13.255 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:890: test_tell_output_file: sudo rm -f /tmp/foo 2026-03-31T20:37:13.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:892: test_tell_output_file: ceph tell --format=json-pretty --daemon-output-file=/tmp/foo mon.0 version 2026-03-31T20:37:13.344 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:892: test_tell_output_file: J=' 2026-03-31T20:37:13.344 INFO:tasks.workunit.client.0.vm03.stderr:{ 2026-03-31T20:37:13.344 INFO:tasks.workunit.client.0.vm03.stderr: "path": "/tmp/foo", 2026-03-31T20:37:13.344 INFO:tasks.workunit.client.0.vm03.stderr: "result": 0, 2026-03-31T20:37:13.344 INFO:tasks.workunit.client.0.vm03.stderr: "output": "", 2026-03-31T20:37:13.344 INFO:tasks.workunit.client.0.vm03.stderr: "len": 99 2026-03-31T20:37:13.344 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:37:13.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:893: test_tell_output_file: expect_true jq -e '.path == "/tmp/foo"' 2026-03-31T20:37:13.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq -e '.path == "/tmp/foo"' 2026-03-31T20:37:13.352 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:13.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:894: test_tell_output_file: expect_true test -e /tmp/foo 2026-03-31T20:37:13.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: test -e /tmp/foo 2026-03-31T20:37:13.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:896: test_tell_output_file: expect_false sed 2q1 2026-03-31T20:37:13.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:13.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: sed 2q1 2026-03-31T20:37:13.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:13.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:897: test_tell_output_file: expect_true jq -e '.version | length > 0' 2026-03-31T20:37:13.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq -e '.version | length > 0' 2026-03-31T20:37:13.363 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:13.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.363 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:898: test_tell_output_file: sudo rm -f /tmp/foo 2026-03-31T20:37:13.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:901: test_tell_output_file: ceph tell --format=json --daemon-output-file=:tmp: mon.0 version 2026-03-31T20:37:13.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:901: test_tell_output_file: J=' 2026-03-31T20:37:13.443 INFO:tasks.workunit.client.0.vm03.stderr:{ 2026-03-31T20:37:13.443 INFO:tasks.workunit.client.0.vm03.stderr: "path": "/tmp/ceph-mon.a.R9n6Pf", 2026-03-31T20:37:13.443 INFO:tasks.workunit.client.0.vm03.stderr: "result": 0, 2026-03-31T20:37:13.443 INFO:tasks.workunit.client.0.vm03.stderr: "output": "", 2026-03-31T20:37:13.443 INFO:tasks.workunit.client.0.vm03.stderr: "len": 79 2026-03-31T20:37:13.443 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:37:13.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:902: test_tell_output_file: jq -r .path 2026-03-31T20:37:13.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:902: test_tell_output_file: path=/tmp/ceph-mon.a.R9n6Pf 2026-03-31T20:37:13.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:903: test_tell_output_file: expect_true test -e /tmp/ceph-mon.a.R9n6Pf 2026-03-31T20:37:13.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: test -e /tmp/ceph-mon.a.R9n6Pf 2026-03-31T20:37:13.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:905: test_tell_output_file: expect_true sudo sh -c 'sed '\''2q1'\'' < "/tmp/ceph-mon.a.R9n6Pf" > /dev/null' 2026-03-31T20:37:13.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: sudo sh -c 'sed '\''2q1'\'' < "/tmp/ceph-mon.a.R9n6Pf" > /dev/null' 2026-03-31T20:37:13.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:906: test_tell_output_file: expect_true sudo sudo sh -c 'jq -e '\''.version | length > 0'\'' < "/tmp/ceph-mon.a.R9n6Pf"' 2026-03-31T20:37:13.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: sudo sudo sh -c 'jq -e '\''.version | length > 0'\'' < "/tmp/ceph-mon.a.R9n6Pf"' 2026-03-31T20:37:13.475 INFO:tasks.workunit.client.0.vm03.stdout:true 
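The steps above exercise `ceph tell --daemon-output-file`: instead of streaming the result back to the client, the daemon writes it to a file on its own host (a literal path such as /tmp/foo, or a mkstemp-style temporary name like /tmp/ceph-mon.a.R9n6Pf when `:tmp:` is given) and the client receives only a small JSON envelope with "path", "result", "output" and "len". The `sed 2q1` probe quits with exit status 1 on reaching a second input line, so expect_true of it asserts the compact --format=json payload is a single line, and the expect_false variant asserts the json-pretty payload is not. A condensed sketch of the same pattern, with hypothetical paths and assuming jq:

    # Ask mon.0 to write its reply to a file on the mon's own host, then inspect it.
    J=$(ceph tell --format=json --daemon-output-file=/tmp/foo mon.0 version)
    echo "$J" | jq -e '.result == 0 and .path == "/tmp/foo"'    # envelope sanity check
    sudo jq -e '.version | length > 0' < /tmp/foo               # payload written by the daemon
    sudo rm -f /tmp/foo

sudo is needed because the file belongs to the daemon's user, as the test's own `sudo sh -c` wrappers show.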
2026-03-31T20:37:13.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:907: test_tell_output_file: sudo rm -f /tmp/ceph-mon.a.R9n6Pf 2026-03-31T20:37:13.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:909: test_tell_output_file: ceph tell --format=json-pretty --daemon-output-file=:tmp: mon.0 version 2026-03-31T20:37:13.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:909: test_tell_output_file: J=' 2026-03-31T20:37:13.558 INFO:tasks.workunit.client.0.vm03.stderr:{ 2026-03-31T20:37:13.558 INFO:tasks.workunit.client.0.vm03.stderr: "path": "/tmp/ceph-mon.a.O565aM", 2026-03-31T20:37:13.558 INFO:tasks.workunit.client.0.vm03.stderr: "result": 0, 2026-03-31T20:37:13.558 INFO:tasks.workunit.client.0.vm03.stderr: "output": "", 2026-03-31T20:37:13.558 INFO:tasks.workunit.client.0.vm03.stderr: "len": 99 2026-03-31T20:37:13.558 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:37:13.558 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:910: test_tell_output_file: jq -r .path 2026-03-31T20:37:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:910: test_tell_output_file: path=/tmp/ceph-mon.a.O565aM 2026-03-31T20:37:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:911: test_tell_output_file: expect_true test -e /tmp/ceph-mon.a.O565aM 2026-03-31T20:37:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: test -e /tmp/ceph-mon.a.O565aM 2026-03-31T20:37:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:913: test_tell_output_file: expect_false sudo sh -c 'sed '\''2q1'\'' < "/tmp/ceph-mon.a.O565aM" > /dev/null' 2026-03-31T20:37:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: sudo sh -c 'sed '\''2q1'\'' < "/tmp/ceph-mon.a.O565aM" > /dev/null' 2026-03-31T20:37:13.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:13.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:914: test_tell_output_file: expect_true sudo sh -c 'jq -e '\''.version | length > 0'\'' < "/tmp/ceph-mon.a.O565aM"' 2026-03-31T20:37:13.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:13.574 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: sudo sh -c 'jq -e '\''.version | length > 0'\'' < "/tmp/ceph-mon.a.O565aM"' 2026-03-31T20:37:13.586 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:13.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:13.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:915: test_tell_output_file: sudo rm -f /tmp/ceph-mon.a.O565aM 2026-03-31T20:37:13.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:37:13.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_ping 2026-03-31T20:37:13.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2704: test_mon_ping: ceph ping mon.a 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "health": { 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "status": "HEALTH_WARN", 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "checks": { 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "PG_AVAILABILITY": { 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "severity": "HEALTH_WARN", 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "summary": { 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "message": "Reduced data availability: 3 pgs inactive, 4 pgs peering", 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "count": 7 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "muted": false 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: "mutes": [] 2026-03-31T20:37:13.855 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "mon_status": { 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "state": "leader", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "election_epoch": 34, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "uptime": 950108, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "quorum": [ 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_age": 299, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "required_con": "2451647607914708996", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "required_mon": [ 2026-03-31T20:37:13.856 
INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_con": "4541880224203014143", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_mon": [ 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "outside_quorum": [], 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "extra_probe_peers": [], 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "sync_provider": [], 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "monmap": { 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 6, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "fsid": "a4a0ca01-ae82-443e-a7c7-50605716689a", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "modified": "2026-03-31T20:25:41.765560Z", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "created": "2026-03-31T20:21:18.374590Z", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release": 20, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release_name": "tentacle", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "election_strategy": 1, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "disallowed_leaders": "", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false, 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "tiebreaker_mon": "", 2026-03-31T20:37:13.856 
INFO:tasks.workunit.client.0.vm03.stdout: "removed_ranks": "", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "persistent": [ 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:13.856 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "optional": [] 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "mons": [ 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789/0", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6789/0", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 1, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b", 2026-03-31T20:37:13.857 
INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790/0", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6790/0", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 2, 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "name": "c", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:37:13.857 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791/0", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6791/0", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: }, 
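(The ping output continues below with the monitor's feature_map.) `ceph ping mon.<id>` returns the cluster health report plus the target monitor's own mon_status, so it doubles as a per-monitor quorum probe. A sketch of turning it into an assertion, assuming jq; the field names are as printed above:

    # Fail unless mon.a answers, sits in a three-member quorum, and is the leader.
    ceph ping mon.a | \
        jq -e '.mon_status | (.quorum | length) == 3 and .state == "leader"'
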
2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "feature_map": { 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "mon": [ 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "osd": [ 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "client": [ 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "num": 3 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:37:13.858 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:37:13.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2705: test_mon_ping: ceph ping mon.b 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "health": { 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "status": "HEALTH_WARN", 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "checks": { 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "PG_AVAILABILITY": { 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "severity": "HEALTH_WARN", 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "summary": { 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "message": "Reduced data availability: 3 pgs inactive, 4 pgs peering", 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "count": 7 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "muted": false 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "mutes": [] 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.919 INFO:tasks.workunit.client.0.vm03.stdout: "mon_status": { 
2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 1, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "state": "peon", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "election_epoch": 34, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "uptime": 950168, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "quorum": [ 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_age": 299, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "required_con": "2451647607914708996", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "required_mon": [ 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_con": "4541880224203014143", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_mon": [ 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "outside_quorum": [], 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "extra_probe_peers": [], 
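(mon.b's status continues below.) The feature_map block in these replies groups the monitor's connected peers by negotiated feature bits; note it names the release those bits correspond to ("squid", features 0x3f07fffffffdffff) rather than the running version, which would explain why a tentacle cluster shows "squid" here. A sketch for tallying connections per daemon type, assuming jq:

    # Count connected peers per daemon type from a mon's feature_map.
    ceph ping mon.b | jq '.mon_status.feature_map | map_values(map(.num) | add)'
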
2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "sync_provider": [], 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "monmap": { 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 6, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "fsid": "a4a0ca01-ae82-443e-a7c7-50605716689a", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "modified": "2026-03-31T20:25:41.765560Z", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "created": "2026-03-31T20:21:18.374590Z", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release": 20, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release_name": "tentacle", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "election_strategy": 1, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "disallowed_leaders": "", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false, 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "tiebreaker_mon": "", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "removed_ranks": "", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "persistent": [ 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:13.920 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "optional": [] 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "mons": [ 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:13.921 
INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789/0", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6789/0", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 1, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790/0", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6790/0", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 2, 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "name": "c", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.921 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.921 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791/0", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6791/0", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "feature_map": { 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "mon": [ 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "osd": [ 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "client": [ 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "mgr": [ 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: 
"stretch_mode": false 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:37:13.922 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:37:13.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2706: test_mon_ping: expect_false ceph ping mon.foo 2026-03-31T20:37:13.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:13.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph ping mon.foo 2026-03-31T20:37:13.979 INFO:tasks.workunit.client.0.vm03.stderr:[errno 2] RADOS object not found (error calling ping_monitor) 2026-03-31T20:37:13.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:13.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2708: test_mon_ping: ceph ping 'mon.*' 2026-03-31T20:37:14.038 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:37:14.037+0000 7ff08dffb640 0 ms_deliver_dispatch: unhandled message 0x7ff0a001a460 mon_map magic: 0 from mon.0 v2:192.168.123.103:3300/0 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:37:14.037+0000 7ff08dffb640 0 ms_deliver_dispatch: unhandled message 0x7ff0a00098e0 mon_map magic: 0 from mon.1 v2:192.168.123.103:3301/0 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout:mon.a 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "health": { 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "status": "HEALTH_WARN", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "checks": { 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "PG_AVAILABILITY": { 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "severity": "HEALTH_WARN", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "summary": { 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "message": "Reduced data availability: 3 pgs inactive, 4 pgs peering", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "count": 7 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "muted": false 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "mutes": [] 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "mon_status": { 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "state": "leader", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "election_epoch": 34, 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "uptime": 950292, 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "quorum": [ 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: 
0, 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_age": 299, 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "required_con": "2451647607914708996", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "required_mon": [ 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:14.039 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_con": "4541880224203014143", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_mon": [ 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "outside_quorum": [], 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "extra_probe_peers": [], 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "sync_provider": [], 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "monmap": { 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 6, 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "fsid": "a4a0ca01-ae82-443e-a7c7-50605716689a", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "modified": "2026-03-31T20:25:41.765560Z", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "created": 
"2026-03-31T20:21:18.374590Z", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release": 20, 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release_name": "tentacle", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "election_strategy": 1, 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "disallowed_leaders": "", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false, 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "tiebreaker_mon": "", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "removed_ranks": "", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "persistent": [ 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "optional": [] 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "mons": [ 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:37:14.040 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789/0", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": 
"192.168.123.103:6789/0", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 1, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790/0", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6790/0", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 2, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "name": "c", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: 
"addr": "192.168.123.103:6791/0", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6791/0", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "feature_map": { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "mon": [ 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: "osd": [ 2026-03-31T20:37:14.041 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "client": [ 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "num": 4 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:37:14.041+0000 7ff08dffb640 0 ms_deliver_dispatch: unhandled message 0x7ff0a00098e0 mon_map magic: 0 from mon.2 v2:192.168.123.103:3302/0 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout:mon.b 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "health": { 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "status": "HEALTH_WARN", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "checks": { 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "PG_AVAILABILITY": { 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "severity": "HEALTH_WARN", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "summary": { 2026-03-31T20:37:14.042 
INFO:tasks.workunit.client.0.vm03.stdout: "message": "Reduced data availability: 3 pgs inactive, 4 pgs peering", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "count": 7 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "muted": false 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "mutes": [] 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "mon_status": { 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 1, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "state": "peon", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "election_epoch": 34, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "uptime": 950288, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "quorum": [ 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_age": 299, 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "required_con": "2451647607914708996", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "required_mon": [ 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:14.042 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_con": "4541880224203014143", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_mon": [ 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 
2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "outside_quorum": [], 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "extra_probe_peers": [], 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "sync_provider": [], 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "monmap": { 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 6, 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "fsid": "a4a0ca01-ae82-443e-a7c7-50605716689a", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "modified": "2026-03-31T20:25:41.765560Z", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "created": "2026-03-31T20:21:18.374590Z", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release": 20, 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release_name": "tentacle", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "election_strategy": 1, 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "disallowed_leaders": "", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false, 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "tiebreaker_mon": "", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "removed_ranks": "", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "persistent": [ 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "optional": [] 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "mons": [ 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a", 2026-03-31T20:37:14.043 
INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:14.043 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789/0", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6789/0", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 1, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790/0", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6790/0", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: 
"rank": 2, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "name": "c", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791/0", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6791/0", 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "feature_map": { 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: "mon": [ 2026-03-31T20:37:14.044 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "osd": [ 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "client": [ 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: ], 
2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "mgr": [ 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout:mon.c 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "health": { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "status": "HEALTH_WARN", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "checks": { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "PG_AVAILABILITY": { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "severity": "HEALTH_WARN", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "summary": { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "message": "Reduced data availability: 3 pgs inactive, 4 pgs peering", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "count": 7 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "muted": false 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "mutes": [] 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "mon_status": { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "name": "c", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 2, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "state": "peon", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "election_epoch": 34, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "uptime": 950304, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "quorum": [ 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: 1, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_age": 299, 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "features": { 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "required_con": "2451647607914708996", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "required_mon": [ 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 
2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:14.045 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_con": "4541880224203014143", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "quorum_mon": [ 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "outside_quorum": [], 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "extra_probe_peers": [], 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "sync_provider": [], 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "monmap": { 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 6, 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "fsid": "a4a0ca01-ae82-443e-a7c7-50605716689a", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "modified": "2026-03-31T20:25:41.765560Z", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "created": "2026-03-31T20:21:18.374590Z", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release": 20, 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "min_mon_release_name": "tentacle", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "election_strategy": 1, 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "disallowed_leaders": "", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false, 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "tiebreaker_mon": "", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "removed_ranks": "", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: 
"features": { 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "persistent": [ 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "kraken", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "luminous", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "mimic", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "osdmap-prune", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "nautilus", 2026-03-31T20:37:14.046 INFO:tasks.workunit.client.0.vm03.stdout: "octopus", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "pacific", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "elector-pinging", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "quincy", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "reef", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "squid", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "tentacle" 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "optional": [] 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "mons": [ 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 0, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "name": "a", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3300", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6789/0", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6789/0", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 1, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "name": "b", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 
2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3301", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6790/0", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6790/0", 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.047 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "rank": 2, 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "name": "c", 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "public_addrs": { 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "addrvec": [ 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v2", 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:3302", 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "type": "v1", 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791", 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "nonce": 0 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "addr": "192.168.123.103:6791/0", 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "public_addr": "192.168.123.103:6791/0", 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 0, 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "weight": 0, 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "crush_location": "{}" 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "feature_map": { 2026-03-31T20:37:14.048 
INFO:tasks.workunit.client.0.vm03.stdout: "mon": [
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff",
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid",
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "osd": [
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff",
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid",
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "num": 1
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "client": [
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "features": "0x3f07fffffffdffff",
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid",
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "num": 3
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: "stretch_mode": false
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-31T20:37:14.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x
2026-03-31T20:37:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_deprecated_commands
2026-03-31T20:37:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2719: test_mon_deprecated_commands: ceph tell 'mon.*' injectargs --mon-debug-deprecated-as-obsolete
2026-03-31T20:37:14.338 INFO:tasks.workunit.client.0.vm03.stdout:mon.a: {}
2026-03-31T20:37:14.346 INFO:tasks.workunit.client.0.vm03.stdout:mon.b: {}
2026-03-31T20:37:14.354 INFO:tasks.workunit.client.0.vm03.stdout:mon.c: {}
2026-03-31T20:37:14.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2720: test_mon_deprecated_commands: expect_false ceph config-key list
2026-03-31T20:37:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2721: test_mon_deprecated_commands: check_response '\(EOPNOTSUPP\|ENOTSUP\): command is obsolete'
2026-03-31T20:37:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='\(EOPNOTSUPP\|ENOTSUP\): command is obsolete'
2026-03-31T20:37:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=
2026-03-31T20:37:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=
2026-03-31T20:37:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']'
2026-03-31T20:37:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- '\(EOPNOTSUPP\|ENOTSUP\): command is obsolete' /tmp/cephtool.sYl/test_invalid.NL9
2026-03-31T20:37:14.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2723: test_mon_deprecated_commands: ceph tell 'mon.*' injectargs --no-mon-debug-deprecated-as-obsolete
2026-03-31T20:37:14.623 INFO:tasks.workunit.client.0.vm03.stdout:mon.a: {}
2026-03-31T20:37:14.630 INFO:tasks.workunit.client.0.vm03.stdout:mon.b: {}
2026-03-31T20:37:14.638 INFO:tasks.workunit.client.0.vm03.stdout:mon.c: {}
2026-03-31T20:37:14.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x
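Both rejections traced above (the unknown "mon.foo" in test_mon_ping and the obsolete "ceph config-key list") rely on the workunit's standard negative-test idiom: expect_false inverts a command's exit status, and check_response greps the captured output for the expected error text. A minimal sketch of that idiom, simplified from the helpers in qa/workunits/cephtool/test.sh (check_response is reduced to its grep step, TMPFILE stands in for the test's /tmp/cephtool.*/test_invalid.* file, and a running cluster with an admin keyring is assumed):

    TMPFILE=$(mktemp)   # stand-in for the test's temp file

    expect_false() {
        set -x
        # succeed only if the wrapped command fails
        if "$@"; then return 1; else return 0; fi
    }

    check_response() {
        expected_string=$1
        if ! grep --quiet -- "$expected_string" "$TMPFILE"; then
            echo "Didn't find '$expected_string' in the output" >&2
            return 1
        fi
    }

    # Mark deprecated commands obsolete, verify one is rejected, then revert:
    ceph tell 'mon.*' injectargs --mon-debug-deprecated-as-obsolete
    expect_false ceph config-key list 2>"$TMPFILE"
    check_response '\(EOPNOTSUPP\|ENOTSUP\): command is obsolete'
    ceph tell 'mon.*' injectargs --no-mon-debug-deprecated-as-obsolete

Bash's xtrace output omits redirections, which is why a 2>$TMPFILE-style capture never shows up in the traced command lines even though check_response clearly finds the message in that file.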
2026-03-31T20:37:14.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_caps
2026-03-31T20:37:14.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:715: test_mon_caps: ceph-authtool --create-keyring /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:14.867 INFO:tasks.workunit.client.0.vm03.stdout:creating /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:14.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:716: test_mon_caps: chmod +r /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:14.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:717: test_mon_caps: ceph-authtool /tmp/cephtool.sYl/ceph.client.bug.keyring -n client.bug --gen-key
2026-03-31T20:37:14.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:718: test_mon_caps: ceph auth add client.bug -i /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:15.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:721: test_mon_caps: rados lspools --no-mon-config --keyring /tmp/cephtool.sYl/ceph.client.bug.keyring -n client.bug
2026-03-31T20:37:15.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:721: test_mon_caps: true
2026-03-31T20:37:15.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:722: test_mon_caps: cat /tmp/cephtool.sYl/test_invalid.NL9
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:37:15.169+0000 7fd0cffff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [2]
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:37:15.169+0000 7fd0dcc69640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [2]
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-31T20:37:15.169+0000 7fd0dd46a640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [2]
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stdout:couldn't connect to cluster: (13) Permission denied
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:723: test_mon_caps: check_response 'Permission denied'
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='Permission denied'
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']'
2026-03-31T20:37:15.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'Permission denied' /tmp/cephtool.sYl/test_invalid.NL9
2026-03-31T20:37:15.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:725: test_mon_caps: rm -rf /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:15.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:726: test_mon_caps: ceph auth del client.bug
2026-03-31T20:37:15.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:727: test_mon_caps: ceph-authtool --create-keyring /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:15.463 INFO:tasks.workunit.client.0.vm03.stdout:creating /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:15.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:728: test_mon_caps: chmod +r /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:15.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:729: test_mon_caps: ceph-authtool /tmp/cephtool.sYl/ceph.client.bug.keyring -n client.bug --gen-key
2026-03-31T20:37:15.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:730: test_mon_caps: ceph-authtool -n client.bug --cap mon '' /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:15.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:731: test_mon_caps: ceph auth add client.bug -i /tmp/cephtool.sYl/ceph.client.bug.keyring
2026-03-31T20:37:15.664 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:37:15.665+0000 7f8708384640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 auth entities have invalid capabilities (AUTH_BAD_CAPS)
2026-03-31T20:37:15.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:732: test_mon_caps: rados lspools --no-mon-config --keyring /tmp/cephtool.sYl/ceph.client.bug.keyring -n client.bug
2026-03-31T20:37:15.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:732: test_mon_caps: true
2026-03-31T20:37:15.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:733: test_mon_caps: check_response 'Permission denied'
2026-03-31T20:37:15.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='Permission denied'
2026-03-31T20:37:15.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=
2026-03-31T20:37:15.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=
2026-03-31T20:37:15.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']'
2026-03-31T20:37:15.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'Permission denied' /tmp/cephtool.sYl/test_invalid.NL9
2026-03-31T20:37:15.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x
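test_mon_caps, traced above, is a pure negative test: client.bug is registered first with no mon cap at all and then re-registered with an explicitly empty one, and in both cases the rados connection must be refused with Permission denied; the empty-cap registration is also what trips the AUTH_BAD_CAPS health warning logged by mon.a. The essence of the second half, condensed into a standalone sketch (the keyring path is illustrative; the test keeps it under its mktemp directory):

    KEYRING=/tmp/ceph.client.bug.keyring   # illustrative path

    ceph-authtool --create-keyring $KEYRING
    ceph-authtool $KEYRING -n client.bug --gen-key
    ceph-authtool -n client.bug --cap mon '' $KEYRING   # empty cap, invalid on purpose
    ceph auth add client.bug -i $KEYRING                # mon flags AUTH_BAD_CAPS here

    # The entity exists but has no usable mon cap, so the connection must fail:
    rados lspools --no-mon-config --keyring $KEYRING -n client.bug 2>&1 |
        grep -q 'Permission denied' && echo 'rejected as expected'

    ceph auth del client.bug   # clean up so the health warning can clear

The --no-mon-config flag in the traced command is presumably there so the failure comes from the auth handshake itself rather than from the client's initial attempt to fetch configuration from the monitors.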
2026-03-31T20:37:21.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2740: test_mon_cephdf_commands: seq 1 10
2026-03-31T20:37:21.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2740: test_mon_cephdf_commands: for i in `seq 1 10`
2026-03-31T20:37:21.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2741: test_mon_cephdf_commands: rados -p cephdf_for_test ls -
2026-03-31T20:37:21.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2741: test_mon_cephdf_commands: grep -q cephdf_for_test
2026-03-31T20:37:21.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2741: test_mon_cephdf_commands: break
2026-03-31T20:37:21.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2746: test_mon_cephdf_commands: flush_pg_stats
2026-03-31T20:37:21.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2258: flush_pg_stats: local timeout=300
2026-03-31T20:37:21.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ceph osd ls
2026-03-31T20:37:21.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2260: flush_pg_stats: ids='0
2026-03-31T20:37:21.273 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-31T20:37:21.273 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-31T20:37:21.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2261: flush_pg_stats: seqs=
2026-03-31T20:37:21.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids
2026-03-31T20:37:21.273 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.0 flush_pg_stats
2026-03-31T20:37:21.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=949187772628
2026-03-31T20:37:21.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 949187772628
2026-03-31T20:37:21.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772628'
2026-03-31T20:37:21.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids
2026-03-31T20:37:21.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.1 flush_pg_stats
2026-03-31T20:37:21.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738580
2026-03-31T20:37:21.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738580
2026-03-31T20:37:21.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772628 1-34359738580'
2026-03-31T20:37:21.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2262: flush_pg_stats: for osd in $ids
2026-03-31T20:37:21.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: timeout 300 ceph tell osd.2 flush_pg_stats
2026-03-31T20:37:21.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2263: flush_pg_stats: seq=34359738580
2026-03-31T20:37:21.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2264: flush_pg_stats: test -z 34359738580
2026-03-31T20:37:21.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2267: flush_pg_stats: seqs=' 0-949187772628 1-34359738580 2-34359738580'
2026-03-31T20:37:21.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs
2026-03-31T20:37:21.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 0-949187772628
2026-03-31T20:37:21.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1
2026-03-31T20:37:21.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=0
2026-03-31T20:37:21.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 0-949187772628
2026-03-31T20:37:21.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2
2026-03-31T20:37:21.516 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 949187772628
2026-03-31T20:37:21.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=949187772628
2026-03-31T20:37:21.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.0 seq 949187772628'
2026-03-31T20:37:21.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-31T20:37:21.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772626 -lt 949187772628
2026-03-31T20:37:21.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2275: flush_pg_stats: sleep 1
2026-03-31T20:37:22.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2276: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-31T20:37:22.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-31T20:37:22.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 949187772628 -lt 949187772628
2026-03-31T20:37:22.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs
2026-03-31T20:37:22.926 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 1-34359738580
2026-03-31T20:37:22.926 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1
2026-03-31T20:37:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=1
2026-03-31T20:37:22.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 1-34359738580
2026-03-31T20:37:22.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2
2026-03-31T20:37:22.929 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 34359738580
2026-03-31T20:37:22.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738580
2026-03-31T20:37:22.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.1 seq 34359738580'
2026-03-31T20:37:22.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-31T20:37:23.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738580 -lt 34359738580
2026-03-31T20:37:23.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2270: flush_pg_stats: for s in $seqs
2026-03-31T20:37:23.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: echo 2-34359738580
2026-03-31T20:37:23.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: cut -d - -f 1
2026-03-31T20:37:23.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2271: flush_pg_stats: osd=2
2026-03-31T20:37:23.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: echo 2-34359738580
2026-03-31T20:37:23.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: cut -d - -f 2
2026-03-31T20:37:23.143 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 34359738580
2026-03-31T20:37:23.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2272: flush_pg_stats: seq=34359738580
2026-03-31T20:37:23.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2273: flush_pg_stats: echo 'waiting osd.2 seq 34359738580'
2026-03-31T20:37:23.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-31T20:37:23.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh:2274: flush_pg_stats: test 34359738580 -lt 34359738580
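flush_pg_stats (from qa/standalone/ceph-helpers.sh) asks every OSD to flush its PG stats, records the sequence number each returns, and then polls ceph osd last-stat-seq until the monitor has absorbed at least that sequence for every OSD — exactly the wait for osd.0 visible above. A condensed sketch of the helper as traced; the original splits the osd-seq pairs with cut and enforces a 300-iteration budget, both simplified here:

    # Flush PG stats on every OSD and wait for the mon to catch up.
    flush_pg_stats() {
        local seqs="" osd seq s
        for osd in $(ceph osd ls); do
            seq=$(timeout 300 ceph tell osd.$osd flush_pg_stats)
            test -z "$seq" && return 1
            seqs="$seqs $osd-$seq"
        done
        for s in $seqs; do
            osd=${s%-*}; seq=${s#*-}          # original uses cut -d - -f 1/2
            echo "waiting osd.$osd seq $seq"
            while test "$(ceph osd last-stat-seq $osd)" -lt "$seq"; do
                sleep 1
            done
        done
    }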
2026-03-31T20:37:23.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2747: test_mon_cephdf_commands: local 'jq_filter=.pools | .[] | select(.name == "cephdf_for_test") | .stats'
2026-03-31T20:37:23.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2748: test_mon_cephdf_commands: ceph df detail --format=json
2026-03-31T20:37:23.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2748: test_mon_cephdf_commands: jq '.pools | .[] | select(.name == "cephdf_for_test") | .stats.stored * 2'
2026-03-31T20:37:23.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2748: test_mon_cephdf_commands: stored=8192
2026-03-31T20:37:23.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2749: test_mon_cephdf_commands: ceph df detail --format=json
2026-03-31T20:37:23.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2749: test_mon_cephdf_commands: jq '.pools | .[] | select(.name == "cephdf_for_test") | .stats.stored_raw'
2026-03-31T20:37:23.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2749: test_mon_cephdf_commands: stored_raw=8192
2026-03-31T20:37:23.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2751: test_mon_cephdf_commands: ceph osd pool delete cephdf_for_test cephdf_for_test --yes-i-really-really-mean-it
2026-03-31T20:37:25.078 INFO:tasks.workunit.client.0.vm03.stderr:pool 'cephdf_for_test' does not exist
2026-03-31T20:37:25.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2752: test_mon_cephdf_commands: rm ./cephdf_for_test
2026-03-31T20:37:25.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2754: test_mon_cephdf_commands: expect_false test 8192 '!=' 8192
2026-03-31T20:37:25.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:37:25.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: test 8192 '!=' 8192
2026-03-31T20:37:25.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:37:25.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x
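With pool size 2 and a single 4 KiB object, ceph df detail should report stored_raw equal to twice stored (2 x 4096 = 8192), which is what the jq extractions above confirm before the pool is torn down. The same check as a standalone sketch:

    # Raw usage should be size-times the logical bytes for a replicated pool.
    stored=$(ceph df detail --format=json |
        jq '.pools | .[] | select(.name == "cephdf_for_test") | .stats.stored * 2')
    stored_raw=$(ceph df detail --format=json |
        jq '.pools | .[] | select(.name == "cephdf_for_test") | .stats.stored_raw')
    test "$stored" = "$stored_raw" || { echo "df mismatch: $stored vs $stored_raw" >&2; exit 1; }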
2026-03-31T20:37:25.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_tell_help_command
2026-03-31T20:37:25.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2800: test_mon_tell_help_command: ceph tell mon.a help
2026-03-31T20:37:25.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2800: test_mon_tell_help_command: grep sync_force
2026-03-31T20:37:25.372 INFO:tasks.workunit.client.0.vm03.stdout:sync_force [--yes-i-really-mean-it] force sync of and clear monitor store
2026-03-31T20:37:25.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2801: test_mon_tell_help_command: ceph tell mon.a -h
2026-03-31T20:37:25.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2801: test_mon_tell_help_command: grep sync_force
2026-03-31T20:37:25.440 INFO:tasks.workunit.client.0.vm03.stdout:sync_force [--yes-i-really-mean-it] force sync of and clear monitor store
2026-03-31T20:37:25.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2802: test_mon_tell_help_command: ceph tell mon.a config -h
2026-03-31T20:37:25.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2802: test_mon_tell_help_command: grep 'config diff get'
2026-03-31T20:37:25.515 INFO:tasks.workunit.client.0.vm03.stdout:config diff get dump diff get : dump diff of
2026-03-31T20:37:25.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2805: test_mon_tell_help_command: expect_false ceph tell mon.zzz help
2026-03-31T20:37:25.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x
2026-03-31T20:37:25.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell mon.zzz help
2026-03-31T20:37:25.577 INFO:tasks.workunit.client.0.vm03.stderr:WARN: the service id you provided does not exist. service id should be one of a/b/c.
2026-03-31T20:37:25.577 INFO:tasks.workunit.client.0.vm03.stderr:target mon.zzz doesn't exists, please pass correct target to tell command, such as mon.a/osd.1/mds.a/mgr
2026-03-31T20:37:25.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0
2026-03-31T20:37:25.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x
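test_mon_tell_help_command greps known entries out of the mon's help output, then relies on expect_false (which succeeds only when its command fails) to confirm that telling a nonexistent monitor is rejected, as the WARN above shows. A sketch of the same three checks:

    # Help text sanity plus a negative test against a bogus mon id.
    ceph tell mon.a help | grep -q sync_force
    ceph tell mon.a config -h | grep -q 'config diff get'
    if ceph tell mon.zzz help; then                    # must fail: no such mon
        echo "tell to nonexistent mon unexpectedly succeeded" >&2
        exit 1
    fi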
2026-03-31T20:37:25.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_stdin_stdout
2026-03-31T20:37:25.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2810: test_mon_stdin_stdout: echo foo
2026-03-31T20:37:25.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2810: test_mon_stdin_stdout: ceph config-key set test_key -i -
2026-03-31T20:37:26.084 INFO:tasks.workunit.client.0.vm03.stderr:set test_key
2026-03-31T20:37:26.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2811: test_mon_stdin_stdout: ceph config-key get test_key -o -
2026-03-31T20:37:26.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2811: test_mon_stdin_stdout: grep -c foo
2026-03-31T20:37:26.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2811: test_mon_stdin_stdout: grep -q 1
2026-03-31T20:37:26.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x
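test_mon_stdin_stdout exercises the CLI's stream arguments: -i - reads the value to set from stdin and -o - writes the fetched value to stdout, so a set/get pair makes a clean roundtrip. Sketch, with a cleanup step the trace does not include:

    # Roundtrip a value through the mon config-key store via stdin/stdout.
    echo foo | ceph config-key set test_key -i -
    ceph config-key get test_key -o - | grep -c foo | grep -q 1   # exactly one hit
    ceph config-key rm test_key                                   # added cleanup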
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: jq -r '.messengers[]' 2026-03-31T20:37:26.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:37:26.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell mon.a messenger dump mon all 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "name": "mon", 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0, 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "num": 0 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [ 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 28, 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:37:26.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 29, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 2, 2026-03-31T20:37:26.809 
INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 33, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.809 
INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "16m", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.348002s", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.347669s", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 478, 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.810 
INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 30, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 2, 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.810 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "16m", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.224001s", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.224888s", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 1069, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.811 
INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [ 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:37:26.811 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4008, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.812 
INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "id": 11872, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 14877, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.00400002s", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.00709064s", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 
2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4009, 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.812 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "id": 11872, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 14880, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0s", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.00278656s", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.813 
INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 6, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 39, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 966, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:37:26.813 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 6663, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.814 
INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.0920005s", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.0930063s", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 374, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 1, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 
2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 41, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 33, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "id": 0, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4164, 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.814 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "4s", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "4s", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 1070, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 
2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 16, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4010, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "id": 11872, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 14883, 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791 2026-03-31T20:37:26.815 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: 
"last_connect_started_ago": "19m", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0s", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.000265799s", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 1235, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 7, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 43, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client",
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 6702,
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1428989468
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m",
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "1.06401s",
2026-03-31T20:37:26.816 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "1.06421s",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 375,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1428989468
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 2,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4008,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "id": 11872,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 14877,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.00400002s",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.00717819s",
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:37:26.817 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
"type": "v2", 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4009, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:37:26.818 
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 14880,
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m",
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0s",
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.00287244s",
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.818 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 6,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 39,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 966,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 6663,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.0920005s",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.0930935s",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 374,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:37:26.819 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 1,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 41,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 33,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "id": 0,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4164,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "4s",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "4s",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 1070,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:37:26.820 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 16,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4010,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "id": 11872,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 14883,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0s",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.000343295s",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 1235,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 7,
"peer_global_seq": 7, 2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.821 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 43, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 980, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 6702, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1428989468 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m", 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "1.06401s", 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "1.06428s", 2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 375, 
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1428989468
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 2,
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.822 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4008,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "id": 11872,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 14877,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.00400002s",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.00725352s",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4,
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
"rx": "UNCOMPRESSED", 2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.823 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4009, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "id": 11872, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 14880, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m", 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0s", 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.00294758s", 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3274002791
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 6,
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.824 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 0,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "16m",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "19m",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "v1": {
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE",
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0,
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown"
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr:}'
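The four structural checks traced below can be replayed by hand against the same command; its form is visible at test.sh:2908 further down, when the loop moves on to the mon-mgrc messenger. A minimal sketch, assuming a reachable mon.a and jq on the PATH, where the messenger name stands in for whatever the loop at test.sh:2907 reads (mon here, mon-mgrc on the next pass):

    messenger=mon
    dump=$(ceph tell mon.a messenger dump "$messenger" all)
    # Top-level shape: a "name" field plus a "messenger" object.
    echo "$dump" | jq --exit-status 'has("messenger")'
    echo "$dump" | jq --exit-status 'has("name")'
    # The dump must be named after the messenger that was asked for.
    echo "$dump" | jq --arg expected_messenger "$messenger" --exit-status '.name == $expected_messenger'
    echo "$dump" | jq --exit-status '.messenger | type == "object"'

Each jq call prints true and exits 0 on a well-formed dump, which is exactly what expect_true asserts in the trace that follows.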
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")'
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:37:26.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")'
2026-03-31T20:37:26.826 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:37:26.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:37:26.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")'
2026-03-31T20:37:26.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:37:26.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")'
2026-03-31T20:37:26.826 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:37:26.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:37:26.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger mon --exit-status '
2026-03-31T20:37:26.826 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:37:26.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:37:26.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger mon --exit-status '
2026-03-31T20:37:26.827 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:37:26.835 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:37:26.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:37:26.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"'
2026-03-31T20:37:26.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:37:26.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"'
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger |
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger |
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:37:26.844 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:37:26.845 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:37:26.845 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:37:26.853 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:37:26.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
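The checks above only assert that the connection arrays exist; the entries themselves can also be summarized per connection. A sketch under the same assumptions as the previous one, with the field paths taken from the dump printed earlier in this log:

    # One tab-separated row per accepting connection: id, state, v2 mode, rx cipher.
    echo "$dump" | jq -r '.messenger.accepting_conns[]
        | [(.conn_id | tostring), .state, .protocol.v2.con_mode, .protocol.v2.crypto.rx]
        | @tsv'

Against the mon dump above this yields rows such as 4008, STATE_CLOSED, unknown, PLAIN, matching the closed entries in accepting_conns. The trace then continues with the next loop iteration, the mon-mgrc messenger.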
2026-03-31T20:37:26.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger
2026-03-31T20:37:26.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell mon.a messenger dump mon-mgrc all
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "name": "mon-mgrc",
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": {
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 11228980187884086661,
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": {
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon",
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "num": 0
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": {
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3099256197
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [],
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": {
2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0,
INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:37:26.929 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 2, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11228980187884086661, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 42, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "12m", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "12m", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "12m", 
2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 24, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 9, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.930 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 293048097 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11228980187884086661, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: 
"status": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 293048097 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "12m", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "12m", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "12m", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 293048097 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:37:26.931 
INFO:tasks.workunit.client.0.vm03.stderr: "rev1": false, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:37:26.931 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [], 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [], 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [ 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11228980187884086661, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 293048097 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: } 
2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "12m", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "12m", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "12m", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 293048097 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": false, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.932 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [ 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11228980187884086661, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:37:26.933 
INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3099256197 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "19m", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "16m", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "19m", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "v1": { 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 
2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:37:26.933 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:37:26.934 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.934 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:37:26.934 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:37:26.934 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:37:26.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 2026-03-31T20:37:26.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:26.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")' 2026-03-31T20:37:26.937 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:26.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:26.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:37:26.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:26.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:37:26.947 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:26.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:26.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger mon-mgrc --exit-status ' 2026-03-31T20:37:26.947 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:37:26.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:26.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger mon-mgrc --exit-status ' 2026-03-31T20:37:26.947 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:37:26.955 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:26.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:26.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:37:26.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 
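The xtrace around this point replays, for the mon-mgrc messenger, the same structural recipe that do_messenger_dump_basics_test already ran against the plain mon messenger. Condensed into a standalone sketch (assembled from the commands traced in this run, not copied from the test source; "mon.a" and the messenger name "mon-mgrc" are taken from the log):

  # Validate the shape of a `messenger dump` payload with jq.
  dump=$(ceph tell mon.a messenger dump mon-mgrc all)
  jq --exit-status 'has("messenger")' <<< "$dump"            # top-level keys exist
  jq --exit-status 'has("name")' <<< "$dump"
  jq --arg expected_messenger mon-mgrc --exit-status \
     '.name == $expected_messenger' <<< "$dump"              # name echoes the request
  jq --exit-status '.messenger | type == "object"' <<< "$dump"
  jq --exit-status '.messenger |
    all([.connections, .listen_sockets, .anon_conns,
         .accepting_conns, .deleted_conns][];
        type == "array")' <<< "$dump"                        # connection lists are arrays

Each jq call uses --exit-status, so a false or null result exits nonzero, which is exactly what expect_true reports in the trace.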
2026-03-31T20:37:26.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:37:26.964 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:37:26.965 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:37:26.973 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:37:26.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:26.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:37:26.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2936: test_mon_messenger_dump: expect_true ceph tell '' messenger dump mon --tcp-info 2026-03-31T20:37:26.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2937: test_mon_messenger_dump: jq 'any(.messenger.connections[].async_connection.tcp_info; has("tcpi_state"))' 2026-03-31T20:37:26.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:26.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: ceph tell '' messenger dump mon --tcp-info 2026-03-31T20:37:27.033 INFO:tasks.workunit.client.0.vm03.stderr:error handling command target: CephName: no . 
in 2026-03-31T20:37:27.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 1 2026-03-31T20:37:27.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:37:27.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_osd_bench 2026-03-31T20:37:27.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2655: test_osd_bench: local 'args= --osd-bench-duration 10 --osd-bench-max-block-size 2097152 --osd-bench-large-size-max-throughput 10485760 --osd-bench-small-size-max-iops 10' 2026-03-31T20:37:27.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2656: test_osd_bench: ceph tell osd.0 injectargs --osd-bench-duration 10 --osd-bench-max-block-size 2097152 --osd-bench-large-size-max-throughput 10485760 --osd-bench-small-size-max-iops 10 2026-03-31T20:37:27.324 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:37:27.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2659: test_osd_bench: expect_false ceph tell osd.0 bench 1 2097153 2026-03-31T20:37:27.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:27.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell osd.0 bench 1 2097153 2026-03-31T20:37:27.402 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: block 'size' values are capped at 2 MiB. 
If you wish to use a higher value, please adjust 'osd_bench_max_block_size' 2026-03-31T20:37:27.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:27.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2661: test_osd_bench: ceph tell osd.0 bench 1 2097152 2026-03-31T20:37:27.495 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:37:27.495 INFO:tasks.workunit.client.0.vm03.stdout: "bytes_written": 1, 2026-03-31T20:37:27.495 INFO:tasks.workunit.client.0.vm03.stdout: "blocksize": 2097152, 2026-03-31T20:37:27.495 INFO:tasks.workunit.client.0.vm03.stdout: "elapsed_sec": 0.0040412800000000004, 2026-03-31T20:37:27.495 INFO:tasks.workunit.client.0.vm03.stdout: "bytes_per_sec": 247.44635363053288, 2026-03-31T20:37:27.495 INFO:tasks.workunit.client.0.vm03.stdout: "iops": 0.00011799161607290882 2026-03-31T20:37:27.495 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:37:27.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2668: test_osd_bench: expect_false ceph tell osd.0 bench 409601 4096 2026-03-31T20:37:27.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:27.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell osd.0 bench 409601 4096 2026-03-31T20:37:27.571 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: 'count' values greater than 409600 for a block size of 4 KiB, assuming 10 IOPS, for 10 seconds, can cause ill effects on osd. Please adjust 'osd_bench_small_size_max_iops' with a higher value if you wish to use a higher 'count'. 
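The `return 0` that follows this EINVAL is the harness helper doing its job: expect_false succeeds exactly when the wrapped command fails. A hedged reconstruction of the two helpers, inferred from the test.sh line numbers in the xtrace (35-36 and 41-42) rather than quoted from the source:

  # expect_false: pass when the command fails.
  function expect_false()
  {
      set -x
      if "$@"; then return 1; else return 0; fi
  }
  # expect_true: pass when the command succeeds.
  function expect_true()
  {
      set -x
      if ! "$@"; then return 1; else return 0; fi
  }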
2026-03-31T20:37:27.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:27.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2670: test_osd_bench: ceph tell osd.0 bench 409600 4096 2026-03-31T20:37:27.660 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:37:27.660 INFO:tasks.workunit.client.0.vm03.stdout: "bytes_written": 409600, 2026-03-31T20:37:27.660 INFO:tasks.workunit.client.0.vm03.stdout: "blocksize": 4096, 2026-03-31T20:37:27.660 INFO:tasks.workunit.client.0.vm03.stdout: "elapsed_sec": 0.0047046950000000001, 2026-03-31T20:37:27.660 INFO:tasks.workunit.client.0.vm03.stdout: "bytes_per_sec": 87061966.822503895, 2026-03-31T20:37:27.660 INFO:tasks.workunit.client.0.vm03.stdout: "iops": 21255.362993775365 2026-03-31T20:37:27.660 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:37:27.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2678: test_osd_bench: expect_false ceph tell osd.0 bench 104857601 2097152 2026-03-31T20:37:27.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:27.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell osd.0 bench 104857601 2097152 2026-03-31T20:37:27.736 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: 'count' values greater than 104857600 for a block size of 2 MiB, assuming 10 MiB/s, for 10 seconds, can cause ill effects on osd. Please adjust 'osd_bench_large_size_max_throughput' with a higher value if you wish to use a higher 'count'. 
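Both rejections are plain arithmetic over the args injected at the start of test_osd_bench: with osd_bench_duration=10, the small-block cap is max IOPS x duration x block size = 10 x 10 x 4096 = 409600 bytes, and the large-block cap is max throughput x duration = 10485760 x 10 = 104857600 bytes, which is why counts 409601 and 104857601 are rejected while 409600 (above) and 104857600 (just below) pass. A quick check of the numbers (formulas inferred from the EINVAL messages, not from OSD source):

  # Recompute the osd bench 'count' caps from the injected args.
  duration=10               # --osd-bench-duration
  small_max_iops=10         # --osd-bench-small-size-max-iops
  large_max_tput=10485760   # --osd-bench-large-size-max-throughput (10 MiB/s)
  echo $(( small_max_iops * duration * 4096 ))   # 409600: cap for 4 KiB blocks
  echo $(( large_max_tput * duration ))          # 104857600: cap for 2 MiB blocks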
2026-03-31T20:37:27.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:27.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2680: test_osd_bench: ceph tell osd.0 bench 104857600 2097152 2026-03-31T20:37:28.256 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:37:28.257 INFO:tasks.workunit.client.0.vm03.stdout: "bytes_written": 104857600, 2026-03-31T20:37:28.257 INFO:tasks.workunit.client.0.vm03.stdout: "blocksize": 2097152, 2026-03-31T20:37:28.257 INFO:tasks.workunit.client.0.vm03.stdout: "elapsed_sec": 0.11815825100000001, 2026-03-31T20:37:28.257 INFO:tasks.workunit.client.0.vm03.stdout: "bytes_per_sec": 887433582.61117113, 2026-03-31T20:37:28.257 INFO:tasks.workunit.client.0.vm03.stdout: "iops": 423.16130762632901 2026-03-31T20:37:28.257 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:37:28.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:37:28.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_osd_negative_filestore_merge_threshold 2026-03-31T20:37:28.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2685: test_osd_negative_filestore_merge_threshold: sudo ceph daemon osd.0 config set filestore_merge_threshold -1 2026-03-31T20:37:28.537 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:37:28.537 INFO:tasks.workunit.client.0.vm03.stdout: "success": "filestore_merge_threshold = '' (not observed, change may require restart) " 2026-03-31T20:37:28.537 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2686: test_osd_negative_filestore_merge_threshold: expect_config_value osd.0 filestore_merge_threshold -1 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:140: expect_config_value: local target config_opt expected_val val 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:141: expect_config_value: target=osd.0 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:142: expect_config_value: config_opt=filestore_merge_threshold 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:143: expect_config_value: expected_val=-1 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: get_config_value_or_die osd.0 filestore_merge_threshold 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:120: get_config_value_or_die: local target config_opt raw val 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:122: get_config_value_or_die: target=osd.0 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:123: 
get_config_value_or_die: config_opt=filestore_merge_threshold 2026-03-31T20:37:28.545 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: sudo ceph daemon osd.0 config get filestore_merge_threshold 2026-03-31T20:37:28.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:125: get_config_value_or_die: raw='{ 2026-03-31T20:37:28.620 INFO:tasks.workunit.client.0.vm03.stderr: "filestore_merge_threshold": "-1" 2026-03-31T20:37:28.620 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:37:28.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:126: get_config_value_or_die: [[ 0 -ne 0 ]] 2026-03-31T20:37:28.621 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: echo '{' '"filestore_merge_threshold":' '"-1"' '}' 2026-03-31T20:37:28.621 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: sed -e 's/[{} "]//g' 2026-03-31T20:37:28.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:131: get_config_value_or_die: raw=filestore_merge_threshold:-1 2026-03-31T20:37:28.622 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: echo filestore_merge_threshold:-1 2026-03-31T20:37:28.622 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: cut -f2 -d: 2026-03-31T20:37:28.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:132: get_config_value_or_die: val=-1 2026-03-31T20:37:28.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:134: get_config_value_or_die: echo -1 2026-03-31T20:37:28.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:135: get_config_value_or_die: return 0 2026-03-31T20:37:28.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:145: expect_config_value: val=-1 2026-03-31T20:37:28.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:147: expect_config_value: [[ -1 != \-\1 ]] 2026-03-31T20:37:28.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:37:28.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_tiering_agent 2026-03-31T20:37:28.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:294: test_tiering_agent: local slow=slow_eviction 2026-03-31T20:37:28.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:295: test_tiering_agent: local fast=fast_eviction 2026-03-31T20:37:28.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:296: test_tiering_agent: ceph osd pool create slow_eviction 1 1 2026-03-31T20:37:29.908 
INFO:tasks.workunit.client.0.vm03.stderr:pool 'slow_eviction' already exists 2026-03-31T20:37:29.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:297: test_tiering_agent: ceph osd pool application enable slow_eviction rados 2026-03-31T20:37:31.866 INFO:tasks.workunit.client.0.vm03.stderr:enabled application 'rados' on pool 'slow_eviction' 2026-03-31T20:37:31.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:298: test_tiering_agent: ceph osd pool create fast_eviction 1 1 2026-03-31T20:37:32.934 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fast_eviction' already exists 2026-03-31T20:37:32.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:299: test_tiering_agent: ceph osd tier add slow_eviction fast_eviction 2026-03-31T20:37:33.943 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fast_eviction' is now (or already was) a tier of 'slow_eviction' 2026-03-31T20:37:33.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:300: test_tiering_agent: ceph osd tier cache-mode fast_eviction writeback 2026-03-31T20:37:34.949 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'fast_eviction' to writeback 2026-03-31T20:37:34.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:301: test_tiering_agent: ceph osd tier set-overlay slow_eviction fast_eviction 2026-03-31T20:37:35.958 INFO:tasks.workunit.client.0.vm03.stderr:overlay for 'slow_eviction' is now (or already was) 'fast_eviction' 2026-03-31T20:37:35.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:302: test_tiering_agent: ceph osd pool set fast_eviction hit_set_type bloom 2026-03-31T20:37:37.914 INFO:tasks.workunit.client.0.vm03.stderr:set pool 41 hit_set_type to bloom 2026-03-31T20:37:37.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:303: test_tiering_agent: rados -p slow_eviction put obj1 /etc/group 2026-03-31T20:37:37.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:304: test_tiering_agent: ceph osd pool set fast_eviction target_max_objects 1 2026-03-31T20:37:39.936 INFO:tasks.workunit.client.0.vm03.stderr:set pool 41 target_max_objects to 1 2026-03-31T20:37:39.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:305: test_tiering_agent: ceph osd pool set fast_eviction hit_set_count 1 2026-03-31T20:37:41.947 INFO:tasks.workunit.client.0.vm03.stderr:set pool 41 hit_set_count to 1 2026-03-31T20:37:41.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:306: test_tiering_agent: ceph osd pool set fast_eviction hit_set_period 5 2026-03-31T20:37:44.017 INFO:tasks.workunit.client.0.vm03.stderr:set pool 41 hit_set_period to 5 2026-03-31T20:37:44.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:308: test_tiering_agent: local evicted 2026-03-31T20:37:44.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:309: test_tiering_agent: evicted=false 2026-03-31T20:37:44.030 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:310: test_tiering_agent: seq 1 300 2026-03-31T20:37:44.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:310: test_tiering_agent: for i in `seq 1 300` 2026-03-31T20:37:44.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:311: test_tiering_agent: rados -p fast_eviction ls 2026-03-31T20:37:44.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:311: test_tiering_agent: grep obj1 2026-03-31T20:37:44.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:312: test_tiering_agent: evicted=true 2026-03-31T20:37:44.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:313: test_tiering_agent: break 2026-03-31T20:37:44.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:317: test_tiering_agent: true 2026-03-31T20:37:44.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:319: test_tiering_agent: rados -p slow_eviction get obj1 - 2026-03-31T20:37:44.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:321: test_tiering_agent: evicted=false 2026-03-31T20:37:44.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:322: test_tiering_agent: seq 1 300 2026-03-31T20:37:44.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:322: test_tiering_agent: for i in `seq 1 300` 2026-03-31T20:37:44.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: rados -p fast_eviction ls 2026-03-31T20:37:44.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: grep obj1 2026-03-31T20:37:44.095 INFO:tasks.workunit.client.0.vm03.stdout:obj1 2026-03-31T20:37:44.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:327: test_tiering_agent: sleep 1 2026-03-31T20:37:45.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:322: test_tiering_agent: for i in `seq 1 300` 2026-03-31T20:37:45.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: rados -p fast_eviction ls 2026-03-31T20:37:45.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: grep obj1 2026-03-31T20:37:45.119 INFO:tasks.workunit.client.0.vm03.stdout:obj1 2026-03-31T20:37:45.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:327: test_tiering_agent: sleep 1 2026-03-31T20:37:46.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:322: test_tiering_agent: for i in `seq 1 300` 2026-03-31T20:37:46.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: 
test_tiering_agent: rados -p fast_eviction ls 2026-03-31T20:37:46.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: grep obj1 2026-03-31T20:37:46.144 INFO:tasks.workunit.client.0.vm03.stdout:obj1 2026-03-31T20:37:46.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:327: test_tiering_agent: sleep 1 2026-03-31T20:37:47.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:322: test_tiering_agent: for i in `seq 1 300` 2026-03-31T20:37:47.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: rados -p fast_eviction ls 2026-03-31T20:37:47.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: grep obj1 2026-03-31T20:37:47.166 INFO:tasks.workunit.client.0.vm03.stdout:obj1 2026-03-31T20:37:47.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:327: test_tiering_agent: sleep 1 2026-03-31T20:37:48.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:322: test_tiering_agent: for i in `seq 1 300` 2026-03-31T20:37:48.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: rados -p fast_eviction ls 2026-03-31T20:37:48.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: grep obj1 2026-03-31T20:37:48.189 INFO:tasks.workunit.client.0.vm03.stdout:obj1 2026-03-31T20:37:48.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:327: test_tiering_agent: sleep 1 2026-03-31T20:37:49.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:322: test_tiering_agent: for i in `seq 1 300` 2026-03-31T20:37:49.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: rados -p fast_eviction ls 2026-03-31T20:37:49.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:323: test_tiering_agent: grep obj1 2026-03-31T20:37:49.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:324: test_tiering_agent: evicted=true 2026-03-31T20:37:49.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:325: test_tiering_agent: break 2026-03-31T20:37:49.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:329: test_tiering_agent: true 2026-03-31T20:37:49.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:330: test_tiering_agent: ceph osd tier remove-overlay slow_eviction 2026-03-31T20:37:49.484 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'slow_eviction' 2026-03-31T20:37:49.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:331: test_tiering_agent: ceph osd tier remove slow_eviction fast_eviction 2026-03-31T20:37:50.494 
INFO:tasks.workunit.client.0.vm03.stderr:pool 'fast_eviction' is now (or already was) not a tier of 'slow_eviction' 2026-03-31T20:37:50.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:332: test_tiering_agent: ceph osd pool delete fast_eviction fast_eviction --yes-i-really-really-mean-it 2026-03-31T20:37:51.497 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fast_eviction' does not exist 2026-03-31T20:37:51.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:333: test_tiering_agent: ceph osd pool delete slow_eviction slow_eviction --yes-i-really-really-mean-it 2026-03-31T20:37:52.502 INFO:tasks.workunit.client.0.vm03.stderr:pool 'slow_eviction' does not exist 2026-03-31T20:37:52.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:37:52.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_admin_heap_profiler 2026-03-31T20:37:52.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2623: test_admin_heap_profiler: do_test=1 2026-03-31T20:37:52.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2624: test_admin_heap_profiler: set +e 2026-03-31T20:37:52.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2626: test_admin_heap_profiler: ceph tell osd.0 heap stats 2026-03-31T20:37:52.792 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 tcmalloc heap stats:------------------------------------------------ 2026-03-31T20:37:52.792 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 42519096 ( 40.5 MiB) Bytes in use by application 2026-03-31T20:37:52.792 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 0 ( 0.0 MiB) Bytes in page heap freelist 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 3951824 ( 3.8 MiB) Bytes in central cache freelist 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 8189696 ( 7.8 MiB) Bytes in transfer cache freelist 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 9794040 ( 9.3 MiB) Bytes in thread cache freelists 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 2883584 ( 2.8 MiB) Bytes in malloc metadata 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: ------------ 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: = 67338240 ( 64.2 MiB) Actual memory used (physical + swap) 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 100245504 ( 95.6 MiB) Bytes released to OS (aka unmapped) 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: ------------ 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: = 167583744 ( 159.8 MiB) Virtual address space used 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 2694 Spans in use 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 45 Thread heaps in use 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 8192 Tcmalloc page size 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:------------------------------------------------ 
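The tcmalloc stats above open test_admin_heap_profiler; the remaining steps (profiler start, dump, stop, release) are traced just below. The whole cycle, condensed from the commands in this run:

  # tcmalloc heap-profiler round trip against one OSD.
  ceph tell osd.0 heap stats               # readable even with the profiler off
  sudo ceph tell osd.0 heap start_profiler
  sudo ceph tell osd.0 heap dump           # writes e.g. /var/log/ceph/osd.0.profile.0001.heap (see below)
  sudo ceph tell osd.0 heap stop_profiler
  sudo ceph tell osd.0 heap release        # hand freelist memory back to the OS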
2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:Call ReleaseFreeMemory() to release freelist memory to the OS (via madvise()). 2026-03-31T20:37:52.793 INFO:tasks.workunit.client.0.vm03.stdout:Bytes released to the OS take up virtual address space but no physical memory. 2026-03-31T20:37:52.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2627: test_admin_heap_profiler: [[ 0 -eq 22 ]] 2026-03-31T20:37:52.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2631: test_admin_heap_profiler: set -e 2026-03-31T20:37:52.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2633: test_admin_heap_profiler: [[ 1 -eq 0 ]] 2026-03-31T20:37:52.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2635: test_admin_heap_profiler: sudo ceph tell osd.0 heap start_profiler 2026-03-31T20:37:52.874 INFO:tasks.ceph.osd.0.vm03.stderr:Starting tracking the heap 2026-03-31T20:37:52.875 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 started profiler 2026-03-31T20:37:52.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2636: test_admin_heap_profiler: sudo ceph tell osd.0 heap dump 2026-03-31T20:37:52.963 INFO:tasks.ceph.osd.0.vm03.stderr:Dumping heap profile to /var/log/ceph//osd.0.profile.0001.heap (admin request) 2026-03-31T20:37:52.964 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 dumping heap profile now. 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:------------------------------------------------ 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 42115992 ( 40.2 MiB) Bytes in use by application 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 32768 ( 0.0 MiB) Bytes in page heap freelist 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 3959904 ( 3.8 MiB) Bytes in central cache freelist 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 8120576 ( 7.7 MiB) Bytes in transfer cache freelist 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 10356488 ( 9.9 MiB) Bytes in thread cache freelists 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 2883584 ( 2.8 MiB) Bytes in malloc metadata 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: ------------ 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: = 67469312 ( 64.3 MiB) Actual memory used (physical + swap) 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: + 100114432 ( 95.5 MiB) Bytes released to OS (aka unmapped) 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: ------------ 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: = 167583744 ( 159.8 MiB) Virtual address space used 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 2695 Spans in use 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 45 Thread heaps in use 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:MALLOC: 8192 Tcmalloc page size 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:------------------------------------------------ 2026-03-31T20:37:52.965 
INFO:tasks.workunit.client.0.vm03.stdout:Call ReleaseFreeMemory() to release freelist memory to the OS (via madvise()). 2026-03-31T20:37:52.965 INFO:tasks.workunit.client.0.vm03.stdout:Bytes released to the OS take up virtual address space but no physical memory. 2026-03-31T20:37:52.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2637: test_admin_heap_profiler: sudo ceph tell osd.0 heap stop_profiler 2026-03-31T20:37:53.051 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 stopped profiler 2026-03-31T20:37:53.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2638: test_admin_heap_profiler: sudo ceph tell osd.0 heap release 2026-03-31T20:37:53.128 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 releasing free RAM back to system. 2026-03-31T20:37:53.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:37:53.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_osd_tell_help_command 2026-03-31T20:37:53.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2816: test_osd_tell_help_command: ceph tell osd.1 help 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: Tell osd commands 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: ================= 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bench [<count>] [<size>] OSD benchmark: write <count> <size>- 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: [<object_size>] [<object_num>] byte objects(with <obj_size> <obj_num>) 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: , (default count=1G default size=4MB). 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: Results in log. 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluefs debug_inject_read_zeros Injects 8K zeros into next BlueFS read. 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: Debug only. 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluefs files list print files in bluefs 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluefs stats Dump internal statistics for bluefs. 
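For reference, the heap-profiler round trip that test_admin_heap_profiler exercised above can be repeated by hand with the same four tell calls seen in the trace (osd.0 as target; the dump file path comes from the "Dumping heap profile" line logged by the OSD):

    sudo ceph tell osd.0 heap start_profiler   # -> "osd.0 started profiler"
    sudo ceph tell osd.0 heap dump             # writes /var/log/ceph/osd.0.profile.0001.heap
    sudo ceph tell osd.0 heap stop_profiler    # -> "osd.0 stopped profiler"
    sudo ceph tell osd.0 heap release          # -> "osd.0 releasing free RAM back to system."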
2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluestore allocator dump block dump allocator free regions 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluestore allocator fragmentation block give allocator fragmentation (0-no 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: fragmentation, 1-absolute 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: fragmentation) 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluestore allocator fragmentation build allocator free regions state 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: histogram block [] histogram 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: [] 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluestore allocator score block give score on allocator fragmentation (0 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: -no fragmentation, 1-absolute 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: fragmentation) 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluestore bluefs device info [<alloc_size>] This also includes an estimation for 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: space available to bluefs at main 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: device. alloc_size, if set, specifies 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: the custom bluefs allocation unit size 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: for the estimation above. 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluestore collections list all collections 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluestore compression stats print compression stats, per collection 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: [] 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluestore list [] list objects in specific collection 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: [] 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:bluestore onode metadata print object internals 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:cache drop Drop all OSD caches 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:cache status Get OSD caches statistics 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:calc_objectstore_db_histogram Generate key value histogram of kvdb( 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: rocksdb) which used by bluestore 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:clear_shards_repaired [] clear num_shards_repaired to clear 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: health warning 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:cluster_log ... log a message to the cluster log 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout:compact Compact object store's omap. 
WARNING: 2026-03-31T20:37:53.407 INFO:tasks.workunit.client.0.vm03.stdout: Compaction probably slows your requests 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:config diff dump diff of current config and default 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: config 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:config diff get dump diff get : dump diff of 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: current and default config setting 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:config get config get : get the config value 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:config help [] get config setting schema and 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: descriptions 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:config set ... config set [ ...]: 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: set a config variable 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:config show dump current config settings 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:config unset config unset : unset a config 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: variable 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:counter dump dump all labeled and non-labeled 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: counters and their values 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:counter schema dump all labeled and non-labeled 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: counters schemas 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:cpu_profiler run cpu profiling on daemon 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:debug dump_missing dump missing objects to a named file 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:debug kick_recovery_wq set osd_recovery_delay_start to 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:deep-scrub [] Trigger a deep scrub 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_blocked_ops [...] show the blocked ops currently in flight 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_blocked_ops_count [...] show the count of blocked ops currently 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: in flight 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_blocklist dump blocklisted clients and times 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_historic_ops [...] show recent ops 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_historic_ops_by_duration show slowest recent ops, sorted by 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: [...] duration 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_historic_slow_ops [...] 
show slowest recent ops 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_mempools get mempool stats 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_objectstore_kv_stats print statistics of kvdb which used by 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: bluestore 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_op_pq_state dump op queue state 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_ops_in_flight [...] show the ops currently in flight 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_osd_network [] Dump osd heartbeat network ping times 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_osd_pg_stats Dump OSD PGs' statistics 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_pg_recovery_stats dump pg recovery statistics 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_pgstate_history show recent state history 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_pool_statfs Dump store's statistics for the given 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: pool 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_recovery_reservations show recovery reservations 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_scrub_reservations show scrub reservations 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_scrubs print scheduled scrubs 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:dump_watchers show clients which have active watches, 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: and on which objects 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:flush_journal flush the journal to permanent store 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:flush_pg_stats flush pg stats 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:flush_store_cache Flush bluestore internal cache 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:get_command_descriptions list available commands 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:get_heap_property get malloc extension heap property 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:get_latest_osdmap force osd to update the latest map from 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: the mon 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:get_mapped_pools dump pools whose PG(s) are mapped to 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: this OSD. 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:getomap output entire object map 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:git_version get git sha1 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:heap [] 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:help list available commands 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:injectargs ... 
inject configuration arguments into 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: running daemon 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:injectdataerr inject data error to an object 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: [] 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:injectecclearreaderr clear read error injects for object in 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout: [] an EC pool 2026-03-31T20:37:53.408 INFO:tasks.workunit.client.0.vm03.stdout:injectecclearwriteerr clear write error inject for object in 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: [] an EC pool 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:injectecreaderr inject error for read of object in an 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: [] [] [] 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:injectecwriteerr inject error for write of object in an 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: [] [] [] 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:injectfull [] [] Inject a full disk (optional count 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: times) 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:injectmdataerr inject metadata error to an object 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: [] 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:list_devices list OSD devices. 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:list_unfound [] [] list unfound objects on this pg, 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: perhaps starting at an offset given in 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: JSON 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:log dump pg_log of a specific pg 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:log dump dump recent log entries to log file 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:log flush flush log entries to log file 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:log reopen reopen log file 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:mark_unfound_lost [] lost, either removing or reverting to 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: a prior version if one is available 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:messenger dump [] [.. 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: .] [--tcp-info] 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:objecter_requests show in-progress osd requests 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:ops [...] 
show the ops currently in flight 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:perf dump [] [] dump non-labeled counters and their 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: values 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:perf histogram dump [] dump perf histogram values 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: [] 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:perf histogram schema dump perf histogram schema 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:perf reset perf reset : perf reset all or 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: one perfcounter name 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:perf schema dump non-labeled counters schemas 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:query show details of a specific pg 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:raise <signal> [--cancel] [--after send <signal> to the daemon 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: <float>] process, optionally delaying 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: <float> seconds; when --after is used, the 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: program will fork before sleeping, 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: which allows to schedule signal 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: delivery to a stopped daemon; it's 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: possible to --cancel a pending signal 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: delivery. <signal> can be in the forms 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: '9', '-9', 'kill', '-KILL'. Use `raise 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout: -l` to list known signal names. 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:reset_pg_recovery_stats reset pg recovery statistics 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:reset_purged_snaps_last Reset the superblock's purged_snaps_last 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:rmomapkey remove omap key 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:rotate-key rotate live authentication key 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:rotate-stored-key Update the stored osd_key 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:schedule-deep-scrub [] [] 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:schedule-scrub [] [] Schedule a scrub 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:scrub [] Trigger a scrub 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:scrub_purged_snaps Scrub purged_snaps vs snapmapper index 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:scrubdebug [] 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:send_beacon send OSD beacon to mon immediately 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:set_heap_property 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:set_recovery_delay [] Delay osd recovery by specified seconds 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:setomapheader
set omap header 2026-03-31T20:37:53.409 INFO:tasks.workunit.client.0.vm03.stdout:setomapval set omap key 2026-03-31T20:37:53.410 INFO:tasks.workunit.client.0.vm03.stdout:smart [] probe OSD devices for SMART data. 2026-03-31T20:37:53.410 INFO:tasks.workunit.client.0.vm03.stdout:status high-level status of OSD 2026-03-31T20:37:53.410 INFO:tasks.workunit.client.0.vm03.stdout:trim stale osdmaps cleanup any existing osdmap from the 2026-03-31T20:37:53.410 INFO:tasks.workunit.client.0.vm03.stdout: store in the range of 0 up to the 2026-03-31T20:37:53.410 INFO:tasks.workunit.client.0.vm03.stdout: superblock's oldest_map. 2026-03-31T20:37:53.410 INFO:tasks.workunit.client.0.vm03.stdout:truncobj truncate object to length 2026-03-31T20:37:53.410 INFO:tasks.workunit.client.0.vm03.stdout:version get ceph version 2026-03-31T20:37:53.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2817: test_osd_tell_help_command: expect_false ceph tell osd.100 help 2026-03-31T20:37:53.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:53.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph tell osd.100 help 2026-03-31T20:37:53.477 INFO:tasks.workunit.client.0.vm03.stderr:WARN: the service id you provided does not exist. service id should be one of 0/1/2. 2026-03-31T20:37:53.477 INFO:tasks.workunit.client.0.vm03.stderr:target osd.100 doesn't exist, please pass correct target to tell command, such as mon.a/osd.1/mds.a/mgr 2026-03-31T20:37:53.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:53.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:37:53.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_osd_compact 2026-03-31T20:37:53.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2822: test_osd_compact: ceph tell osd.1 compact 2026-03-31T20:37:53.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2823: test_osd_compact: sudo ceph daemon osd.1 compact 2026-03-31T20:37:53.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:37:54.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_per_pool_scrub_status 2026-03-31T20:37:54.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2863: test_per_pool_scrub_status: ceph osd pool create noscrub_pool 16 2026-03-31T20:37:54.523 INFO:tasks.workunit.client.0.vm03.stderr:pool 'noscrub_pool' already exists 2026-03-31T20:37:54.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2864: test_per_pool_scrub_status: ceph osd pool create noscrub_pool2 16 2026-03-31T20:37:55.528 INFO:tasks.workunit.client.0.vm03.stderr:pool 'noscrub_pool2' already exists 2026-03-31T20:37:55.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2865: 
test_per_pool_scrub_status: ceph -s 2026-03-31T20:37:55.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2865: test_per_pool_scrub_status: expect_false grep -q 'Some pool(s) have the.*scrub.* flag(s) set' 2026-03-31T20:37:55.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:55.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep -q 'Some pool(s) have the.*scrub.* flag(s) set' 2026-03-31T20:37:55.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:55.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2866: test_per_pool_scrub_status: ceph -s --format json 2026-03-31T20:37:55.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2867: test_per_pool_scrub_status: jq .health.checks.POOL_SCRUB_FLAGS.summary.message 2026-03-31T20:37:55.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2868: test_per_pool_scrub_status: expect_false grep -q 'Some pool(s) have the.*scrub.* flag(s) set' 2026-03-31T20:37:55.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:55.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep -q 'Some pool(s) have the.*scrub.* flag(s) set' 2026-03-31T20:37:56.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:56.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2869: test_per_pool_scrub_status: ceph report 2026-03-31T20:37:56.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2869: test_per_pool_scrub_status: jq .health.checks.POOL_SCRUB_FLAGS.detail 2026-03-31T20:37:56.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2870: test_per_pool_scrub_status: expect_false grep -q 'Pool .* has .*scrub.* flag' 2026-03-31T20:37:56.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:56.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep -q 'Pool .* has .*scrub.* flag' 2026-03-31T20:37:56.363 INFO:tasks.workunit.client.0.vm03.stderr:report 1809581128 2026-03-31T20:37:56.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:56.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2871: test_per_pool_scrub_status: ceph health detail 2026-03-31T20:37:56.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2871: test_per_pool_scrub_status: jq .health.checks.POOL_SCRUB_FLAGS.detail 2026-03-31T20:37:56.375 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2872: test_per_pool_scrub_status: expect_false grep -q 'Pool .* has .*scrub.* flag' 2026-03-31T20:37:56.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:37:56.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: grep -q 'Pool .* has .*scrub.* flag' 2026-03-31T20:37:56.642 INFO:tasks.workunit.client.0.vm03.stderr:parse error: Invalid numeric literal at line 1, column 11 2026-03-31T20:37:56.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:37:56.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2874: test_per_pool_scrub_status: ceph osd pool set noscrub_pool noscrub 1 2026-03-31T20:37:58.501 INFO:tasks.workunit.client.0.vm03.stderr:set pool 42 noscrub to 1 2026-03-31T20:37:58.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2875: test_per_pool_scrub_status: ceph -s 2026-03-31T20:37:58.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2875: test_per_pool_scrub_status: expect_true grep -q 'Some pool(s) have the noscrub flag(s) set' 2026-03-31T20:37:58.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:58.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Some pool(s) have the noscrub flag(s) set' 2026-03-31T20:37:58.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:58.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2876: test_per_pool_scrub_status: ceph -s --format json 2026-03-31T20:37:58.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2877: test_per_pool_scrub_status: jq .health.checks.POOL_SCRUB_FLAGS.summary.message 2026-03-31T20:37:58.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2878: test_per_pool_scrub_status: expect_true grep -q 'Some pool(s) have the noscrub flag(s) set' 2026-03-31T20:37:58.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:58.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Some pool(s) have the noscrub flag(s) set' 2026-03-31T20:37:59.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:59.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2879: test_per_pool_scrub_status: ceph report 2026-03-31T20:37:59.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2879: test_per_pool_scrub_status: jq 
.health.checks.POOL_SCRUB_FLAGS.detail 2026-03-31T20:37:59.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2880: test_per_pool_scrub_status: expect_true grep -q 'Pool noscrub_pool has noscrub flag' 2026-03-31T20:37:59.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:59.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Pool noscrub_pool has noscrub flag' 2026-03-31T20:37:59.327 INFO:tasks.workunit.client.0.vm03.stderr:report 2615444928 2026-03-31T20:37:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2881: test_per_pool_scrub_status: ceph health detail 2026-03-31T20:37:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2881: test_per_pool_scrub_status: expect_true grep -q 'Pool noscrub_pool has noscrub flag' 2026-03-31T20:37:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:37:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Pool noscrub_pool has noscrub flag' 2026-03-31T20:37:59.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:37:59.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2883: test_per_pool_scrub_status: ceph osd pool set noscrub_pool nodeep-scrub 1 2026-03-31T20:38:01.518 INFO:tasks.workunit.client.0.vm03.stderr:set pool 42 nodeep-scrub to 1 2026-03-31T20:38:01.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2884: test_per_pool_scrub_status: ceph osd pool set noscrub_pool2 nodeep-scrub 1 2026-03-31T20:38:03.526 INFO:tasks.workunit.client.0.vm03.stderr:set pool 43 nodeep-scrub to 1 2026-03-31T20:38:03.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2885: test_per_pool_scrub_status: ceph -s 2026-03-31T20:38:03.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2885: test_per_pool_scrub_status: expect_true grep -q 'Some pool(s) have the noscrub, nodeep-scrub flag(s) set' 2026-03-31T20:38:03.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:03.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Some pool(s) have the noscrub, nodeep-scrub flag(s) set' 2026-03-31T20:38:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:03.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2886: test_per_pool_scrub_status: ceph -s --format json 2026-03-31T20:38:03.825 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2887: test_per_pool_scrub_status: jq .health.checks.POOL_SCRUB_FLAGS.summary.message 2026-03-31T20:38:03.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2888: test_per_pool_scrub_status: expect_true grep -q 'Some pool(s) have the noscrub, nodeep-scrub flag(s) set' 2026-03-31T20:38:03.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:03.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Some pool(s) have the noscrub, nodeep-scrub flag(s) set' 2026-03-31T20:38:04.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:04.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2889: test_per_pool_scrub_status: ceph report 2026-03-31T20:38:04.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2889: test_per_pool_scrub_status: jq .health.checks.POOL_SCRUB_FLAGS.detail 2026-03-31T20:38:04.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2890: test_per_pool_scrub_status: expect_true grep -q 'Pool noscrub_pool has noscrub flag' 2026-03-31T20:38:04.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:04.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Pool noscrub_pool has noscrub flag' 2026-03-31T20:38:04.377 INFO:tasks.workunit.client.0.vm03.stderr:report 3728345664 2026-03-31T20:38:04.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:04.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2891: test_per_pool_scrub_status: ceph report 2026-03-31T20:38:04.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2891: test_per_pool_scrub_status: jq .health.checks.POOL_SCRUB_FLAGS.detail 2026-03-31T20:38:04.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2892: test_per_pool_scrub_status: expect_true grep -q 'Pool noscrub_pool has nodeep-scrub flag' 2026-03-31T20:38:04.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:04.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Pool noscrub_pool has nodeep-scrub flag' 2026-03-31T20:38:04.658 INFO:tasks.workunit.client.0.vm03.stderr:report 575521567 2026-03-31T20:38:04.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:04.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2893: test_per_pool_scrub_status: ceph report 
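The per-pool scrub assertions above boil down to a short, reusable sequence. A condensed sketch of what test_per_pool_scrub_status checks, using only commands that appear in this trace; the final set with value 0 is the assumed inverse of the set with 1 and is not exercised by the test:

    ceph osd pool set noscrub_pool noscrub 1        # raises the POOL_SCRUB_FLAGS health check
    ceph -s --format json | jq '.health.checks.POOL_SCRUB_FLAGS.summary.message'
                                                    # -> "Some pool(s) have the noscrub flag(s) set"
    ceph health detail | grep 'Pool noscrub_pool has noscrub flag'
    ceph osd pool set noscrub_pool noscrub 0        # assumed inverse, clears the per-pool flag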
2026-03-31T20:38:04.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2893: test_per_pool_scrub_status: jq .health.checks.POOL_SCRUB_FLAGS.detail 2026-03-31T20:38:04.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2894: test_per_pool_scrub_status: expect_true grep -q 'Pool noscrub_pool2 has nodeep-scrub flag' 2026-03-31T20:38:04.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:04.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Pool noscrub_pool2 has nodeep-scrub flag' 2026-03-31T20:38:04.938 INFO:tasks.workunit.client.0.vm03.stderr:report 1581074471 2026-03-31T20:38:04.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:04.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2895: test_per_pool_scrub_status: ceph health detail 2026-03-31T20:38:04.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2895: test_per_pool_scrub_status: expect_true grep -q 'Pool noscrub_pool has noscrub flag' 2026-03-31T20:38:04.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:04.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Pool noscrub_pool has noscrub flag' 2026-03-31T20:38:05.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:05.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2896: test_per_pool_scrub_status: ceph health detail 2026-03-31T20:38:05.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2896: test_per_pool_scrub_status: expect_true grep -q 'Pool noscrub_pool has nodeep-scrub flag' 2026-03-31T20:38:05.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:05.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Pool noscrub_pool has nodeep-scrub flag' 2026-03-31T20:38:05.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:05.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2897: test_per_pool_scrub_status: ceph health detail 2026-03-31T20:38:05.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2897: test_per_pool_scrub_status: expect_true grep -q 'Pool noscrub_pool2 has nodeep-scrub flag' 2026-03-31T20:38:05.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:05.497 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -q 'Pool noscrub_pool2 has nodeep-scrub flag' 2026-03-31T20:38:05.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:05.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2899: test_per_pool_scrub_status: ceph osd pool rm noscrub_pool noscrub_pool --yes-i-really-really-mean-it 2026-03-31T20:38:06.604 INFO:tasks.workunit.client.0.vm03.stderr:pool 'noscrub_pool' does not exist 2026-03-31T20:38:06.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2900: test_per_pool_scrub_status: ceph osd pool rm noscrub_pool2 noscrub_pool2 --yes-i-really-really-mean-it 2026-03-31T20:38:07.611 INFO:tasks.workunit.client.0.vm03.stderr:pool 'noscrub_pool2' does not exist 2026-03-31T20:38:07.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:38:07.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_osd_messenger_dump 2026-03-31T20:38:07.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2927: test_osd_messenger_dump: do_messenger_dump_basics_test osd.0 2026-03-31T20:38:07.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2905: do_messenger_dump_basics_test: local target=osd.0 2026-03-31T20:38:07.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2906: do_messenger_dump_basics_test: ceph tell osd.0 messenger dump 2026-03-31T20:38:07.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2906: do_messenger_dump_basics_test: expect_true jq --exit-status '.messengers | length > 0' 2026-03-31T20:38:07.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:07.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messengers | length > 0' 2026-03-31T20:38:07.931 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:07.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:07.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: ceph tell osd.0 messenger dump 2026-03-31T20:38:07.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: jq -r '.messengers[]' 2026-03-31T20:38:07.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:38:08.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell osd.0 messenger dump client all 2026-03-31T20:38:08.117 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: "name": "client", 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 6845614358180639698, 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: "num": 0 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.117 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [ 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 19, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 5, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 529836283 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.118 
INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 73, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 74, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "type": 8, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "id": "admin" 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "id": 12268, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 12268, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 529836283 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0s", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.000124863s", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 49, 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804", 2026-03-31T20:38:08.118 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 529836283 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 5, 
2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 70, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 66, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "id": "2" 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "id": 2, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4144, 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: 
"nonce": 1523795840 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.119 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "24s", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "24s", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 37, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 1, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.120 
INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 69, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 44, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "type": 16, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "id": "x" 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "id": 6663, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 6663, 2026-03-31T20:38:08.120 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "44s", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "44s", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 39, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.121 
INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 8, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 71, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 43, 2026-03-31T20:38:08.121 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.122 
INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "12m", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "11m", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "11m", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 33, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 1, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "tx": 
"UNCOMPRESSED" 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:38:08.122 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 57, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: { 
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789",
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "16m",
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.555999s",
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.554277s",
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 27,
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:38:08.123 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 22,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [],
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [],
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [],
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "16m",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "20m",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:38:08.124 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: "v1": {
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE",
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0,
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown"
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr:}'
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")'
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")'
2026-03-31T20:38:08.126 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")'
2026-03-31T20:38:08.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")'
2026-03-31T20:38:08.137 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger client --exit-status '
2026-03-31T20:38:08.137 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:38:08.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger client --exit-status '
2026-03-31T20:38:08.137 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:38:08.146 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"'
2026-03-31T20:38:08.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"'
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger |
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger |
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:38:08.155 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:38:08.156 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:38:08.156 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:38:08.164 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger
2026-03-31T20:38:08.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell osd.0 messenger dump cluster all
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "name": "cluster",
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": {
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 6845614358181639698,
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": {
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "num": 0
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": {
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6813",
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 951776786
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 20,
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": {
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0,
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s"
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 2,
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.256 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6809",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358181639698,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 72,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 8,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6809",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "10m",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "1.324s",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "1.32538s",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 41,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6809",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 5,
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.257 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6801",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358181639698,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 63,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 6,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6801",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "10m",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "1.308s",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "1.30895s",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 37,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6801",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 6,
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:38:08.258 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [],
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358181639698,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 7,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "id": "2"
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "id": 2,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4144,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6809",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "10m",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "10m",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6813",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 951776786
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6809",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3,
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:38:08.259 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358181639698,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 7,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "id": "2"
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "id": 2,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4144,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6809",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "10m",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "10m",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6813",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 951776786
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6809",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3,
2026-03-31T20:38:08.260 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358181639698,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6813",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 951776786
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "16m",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "20m",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "v1": {
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE",
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0,
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown"
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:38:08.261 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.262 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.262 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.262 INFO:tasks.workunit.client.0.vm03.stderr:}'
2026-03-31T20:38:08.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")'
2026-03-31T20:38:08.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")'
2026-03-31T20:38:08.265 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")'
2026-03-31T20:38:08.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")'
2026-03-31T20:38:08.274 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger cluster --exit-status '
2026-03-31T20:38:08.275 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:38:08.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger cluster --exit-status '
2026-03-31T20:38:08.275 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:38:08.283 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"'
2026-03-31T20:38:08.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"'
2026-03-31T20:38:08.292 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger |
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger |
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:38:08.293 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:38:08.301 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:38:08.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:38:08.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger
2026-03-31T20:38:08.302 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell osd.0 messenger dump hb_back_client all
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "name": "hb_back_client",
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": {
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 6845614358180639698,
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": {
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "num": 0
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": {
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [],
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": {
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0,
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s"
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 2,
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [
2026-03-31T20:38:08.391 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6811",
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698,
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 61,
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5,
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6811",
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "10m", 2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "2s", 2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "2s", 2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 38, 2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.392 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6811", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6803", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 
2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.393 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 60, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6803", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "10m", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "2s", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "2s", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 39, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6803", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.394 
INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.394 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [], 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [], 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [], 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [ 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: { 
2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "16m", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "20m", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "v1": { 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.395 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:38:08.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 2026-03-31T20:38:08.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")' 2026-03-31T20:38:08.399 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.399 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:38:08.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:38:08.410 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger hb_back_client --exit-status ' 2026-03-31T20:38:08.410 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger hb_back_client --exit-status ' 2026-03-31T20:38:08.410 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.419 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.430 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.431 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.431 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.442 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:38:08.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell osd.0 messenger dump hb_back_server all 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "name": "hb_back_server", 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 6845614358180639698, 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "num": 0 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.533 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6807", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [ 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 22, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:38:08.534 
INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 0, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [], 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [ 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 65, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "id": "2" 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "id": 2, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4144, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "4s", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "4s", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 40, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6807", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: 
"type": "any", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.534 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 68, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "id": "1" 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "id": 1, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4145, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.535 
INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "1.68s", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "1.6805s", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 41, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6807", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.535 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [ 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.536 
INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 65, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "id": "2" 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "id": 2, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4144, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "4s", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "4s", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 40, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6807", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.536 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.537 
INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 68, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "id": "1" 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "id": 1, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4145, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: } 
2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "1.68s", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "1.68052s", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 41, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6807", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.537 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [], 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [ 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.538 
INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6807", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "16m", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "20m", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "v1": { 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.538 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 
2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")' 2026-03-31T20:38:08.542 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:38:08.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:38:08.555 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger hb_back_server --exit-status ' 2026-03-31T20:38:08.556 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger hb_back_server --exit-status ' 2026-03-31T20:38:08.556 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.565 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.565 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.574 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.583 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:38:08.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell osd.0 messenger dump hb_front_client all 2026-03-31T20:38:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:38:08.671 INFO:tasks.workunit.client.0.vm03.stderr: "name": "hb_front_client", 2026-03-31T20:38:08.671 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 6845614358180639698, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "num": 0 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.672 
INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [], 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 2, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6810", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 64, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.672 
INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6810", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "10m", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "2s", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "2s", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 13, 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.672 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6810", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.673 
INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6802", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 62, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6802", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "10m", 2026-03-31T20:38:08.673 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "2s", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "2s", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 35, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.674 
INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6802", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [], 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [], 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [], 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [ 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.674 
INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.674 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "16m", 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "20m", 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "v1": { 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 
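
The xtrace records above and below come from do_messenger_dump_basics_test in qa/workunits/cephtool/test.sh (the traced lines are 2907-2914 of the cloned script). Reconstructed from the trace, the loop looks roughly like the sketch below: the jq filters and the `ceph tell osd.0 messenger dump <name> all` command are copied verbatim from the trace, but the list of messenger names fed to `read` and the exact plumbing of $dump into `expect_true jq` are not visible here, so the while/here-string scaffolding is an assumption.

    # Sketch reconstructed from the xtrace above (test.sh:2907-2914).
    # How each dump reaches jq is not shown in the trace; here-strings assumed.
    while read -r messenger; do
        # test.sh:2908 -- e.g. `ceph tell osd.0 messenger dump hb_front_client all`
        dump=$(ceph tell osd.0 messenger dump "$messenger" all)
        # test.sh:2909-2911 -- top-level keys, and the name echoes the request
        expect_true jq --exit-status 'has("messenger")' <<< "$dump"
        expect_true jq --exit-status 'has("name")' <<< "$dump"
        expect_true jq --arg expected_messenger "$messenger" --exit-status '
            .name == $expected_messenger' <<< "$dump"
        # test.sh:2913-2914 -- the body is an object whose connection buckets
        # are all arrays
        expect_true jq --exit-status '.messenger | type == "object"' <<< "$dump"
        expect_true jq --exit-status '.messenger |
            all([.connections,
                 .listen_sockets,
                 .anon_conns,
                 .accepting_conns,
                 .deleted_conns][];
                type == "array")' <<< "$dump"
    done

jq's --exit-status (-e) makes the exit code reflect the last output (non-zero on false/null), which is what lets expect_true treat each filter as a pass/fail assertion, as the `return 0` records above show.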
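The dumps themselves (hb_front_client above, hb_front_server and ms_objecter further down) all share one shape: a top-level "name" plus a "messenger" object holding my_name/my_addrs, a dispatch_queue, and the five connection buckets, where established connections carry peer identity, target_addr, and the negotiated v1/v2 protocol state. When picking one apart offline, a throwaway summary like the following can help; dump.json and the helper itself are hypothetical, but the field paths match the dumps in this log.

    # Hypothetical helper: one tab-separated row per connection in a saved dump.
    # connections[] nest the details under .async_connection; anon_conns[] and
    # accepting_conns[] hold the connection object directly.
    jq -r '
      (.messenger.connections[]?.async_connection,
       .messenger.anon_conns[]?,
       .messenger.accepting_conns[]?)
      | [ .conn_id,
          .state,
          .peer.type + "." + (.peer.id | tostring),
          .target_addr.addr,
          (.protocol.v2.con_mode // .protocol.v1.con_mode) ]
      | @tsv' dump.json

Run against the hb_front_server dump below, this would print rows roughly like `4  STATE_CONNECTION_ESTABLISHED  osd.2  192.168.123.103:0  crc`.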
2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")' 2026-03-31T20:38:08.680 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:38:08.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:38:08.689 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger hb_front_client --exit-status ' 2026-03-31T20:38:08.689 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger hb_front_client --exit-status ' 2026-03-31T20:38:08.689 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.699 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.708 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: 
.listen_sockets, 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.709 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.718 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:38:08.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell osd.0 messenger dump hb_front_server all 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "name": "hb_front_server", 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 6845614358180639698, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "num": 0 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6806", 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [ 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 21, 2026-03-31T20:38:08.804 
INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 0, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [], 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [ 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 66, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "id": "2" 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.804 INFO:tasks.workunit.client.0.vm03.stderr: "id": 2, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4144, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "4s", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "4s", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 14, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.805 
INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6806", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 67, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "id": "1" 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.805 
INFO:tasks.workunit.client.0.vm03.stderr: "id": 1, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4145, 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.805 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "1.952s", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "1.95151s", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 38, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6806", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.806 
INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [ 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.806 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 66, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "id": "2" 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "id": 2, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4144, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "4s", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "4s", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 14, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6806", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 
2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.807 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 67, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "id": "1" 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "id": 1, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4145, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.808 
INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "1.952s", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "1.95153s", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 38, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6806", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [], 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [ 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.808 
INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE", 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.808 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6806", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "16m", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "20m", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: 
"v1": { 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")' 2026-03-31T20:38:08.813 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:38:08.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:38:08.822 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger hb_front_server --exit-status ' 2026-03-31T20:38:08.822 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger hb_front_server --exit-status ' 2026-03-31T20:38:08.822 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.831 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.832 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.841 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.850 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:38:08.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell osd.0 messenger dump ms_objecter all 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "name": "ms_objecter", 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 6845614358180639698, 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: 
"type": "osd", 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "num": 0 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [], 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:38:08.942 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 2, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 76, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 
2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "13m", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "13m", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "13m", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 11, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 17, 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.943 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: } 
2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6808", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 75, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 2, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6808", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "30s", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "30s", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 35, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "addr": 
"(unrecognized address family 0)", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6808", 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.944 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "crc", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 12, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [], 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [], 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [], 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [ 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 6845614358180639698, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: 
"peer": { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "20m", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "16m", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "20m", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.945 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "v1": { 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:38:08.946 
INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")' 2026-03-31T20:38:08.952 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:38:08.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:38:08.961 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger ms_objecter --exit-status ' 2026-03-31T20:38:08.962 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger ms_objecter --exit-status ' 2026-03-31T20:38:08.962 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:38:08.971 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.982 
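Note on the checks above: do_messenger_dump_basics_test feeds the captured messenger dump through a series of jq --exit-status filters, so any structural regression in the dump fails the pipeline via the exit code. A minimal standalone sketch of the same assertions, assuming the dump has been saved to a hypothetical file $DUMP_JSON (the capture step itself happened earlier in the run and is not shown here):

    DUMP_JSON=/tmp/messenger_dump.json   # hypothetical capture of the dump above
    jq --exit-status 'has("messenger") and has("name")' "$DUMP_JSON"
    jq --arg expected_messenger ms_objecter --exit-status \
        '.name == $expected_messenger' "$DUMP_JSON"
    jq --exit-status '.messenger | type == "object"' "$DUMP_JSON"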
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:38:08.982 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:38:08.992 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:38:08.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:08.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:38:08.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:38:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mds_tell 2026-03-31T20:38:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:920: test_mds_tell: local FS_NAME=cephfs 2026-03-31T20:38:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:921: test_mds_tell: mds_exists 2026-03-31T20:38:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:861: mds_exists: ceph auth ls 2026-03-31T20:38:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:861: mds_exists: grep '^mds' 2026-03-31T20:38:09.499 INFO:tasks.workunit.client.0.vm03.stdout:Skipping test, no MDS found 2026-03-31T20:38:09.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:922: test_mds_tell: echo 'Skipping test, no MDS found' 2026-03-31T20:38:09.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:923: test_mds_tell: return 2026-03-31T20:38:09.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:38:09.714 
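test_mds_tell bails out early here because mds_exists finds no mds.* entity in the auth database. From the trace above (test.sh line 861 shows the pipeline), the helper is essentially the following; the surrounding guard is a reconstruction of how the trace at lines 921-923 reads:

    mds_exists() {
        ceph auth ls | grep "^mds"
    }
    # inside a test function:
    mds_exists || { echo "Skipping test, no MDS found"; return; }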
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_mds 2026-03-31T20:38:09.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:968: test_mon_mds: local FS_NAME=cephfs 2026-03-31T20:38:09.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:969: test_mon_mds: remove_all_fs 2026-03-31T20:38:09.714 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:847: remove_all_fs: ceph fs ls --format=json 2026-03-31T20:38:09.714 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:847: remove_all_fs: python3 -c 'import json; import sys; print('\'' '\''.join([fs['\''name'\''] for fs in json.load(sys.stdin)]))' 2026-03-31T20:38:09.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:847: remove_all_fs: existing_fs= 2026-03-31T20:38:09.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:971: test_mon_mds: ceph osd pool create fs_data 16 2026-03-31T20:38:10.618 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fs_data' already exists 2026-03-31T20:38:10.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:972: test_mon_mds: ceph osd pool create fs_metadata 16 2026-03-31T20:38:11.624 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fs_metadata' already exists 2026-03-31T20:38:11.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:973: test_mon_mds: ceph fs new cephfs fs_metadata fs_data 2026-03-31T20:38:11.805 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:38:11.805+0000 7f8708384640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-31T20:38:12.818 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs' already exists 2026-03-31T20:38:12.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:975: test_mon_mds: ceph fs set cephfs cluster_down true 2026-03-31T20:38:14.832 INFO:tasks.workunit.client.0.vm03.stderr:cephfs marked not joinable; MDS cannot join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 2026-03-31T20:38:14.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:976: test_mon_mds: ceph fs set cephfs cluster_down false 2026-03-31T20:38:16.845 INFO:tasks.workunit.client.0.vm03.stderr:cephfs marked joinable; MDS may join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 
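Both cluster_down toggles above succeed but warn that the flag is deprecated. The replacement the warning points to is the joinable flag, whose sense is inverted relative to cluster_down:

    ceph fs set cephfs joinable false   # equivalent to: cluster_down true
    ceph fs set cephfs joinable true    # equivalent to: cluster_down false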
2026-03-31T20:38:16.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:978: test_mon_mds: ceph fs set cephfs max_mds 2 2026-03-31T20:38:18.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:979: test_mon_mds: ceph fs get cephfs 2026-03-31T20:38:18.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:979: test_mon_mds: expect_true grep -P -q 'max_mds\t2' 2026-03-31T20:38:18.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:18.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -P -q 'max_mds\t2' 2026-03-31T20:38:19.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:19.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:980: test_mon_mds: ceph fs set cephfs down false 2026-03-31T20:38:20.867 INFO:tasks.workunit.client.0.vm03.stderr:cephfs is already online 2026-03-31T20:38:20.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:981: test_mon_mds: ceph fs get cephfs 2026-03-31T20:38:20.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:981: test_mon_mds: expect_true grep -P -q 'max_mds\t2' 2026-03-31T20:38:20.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:38:20.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: grep -P -q 'max_mds\t2' 2026-03-31T20:38:21.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:38:21.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:983: test_mon_mds: ceph mds compat rm_incompat 4 2026-03-31T20:38:22.883 INFO:tasks.workunit.client.0.vm03.stderr:incompat feature 4 not present in compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-31T20:38:22.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:984: test_mon_mds: ceph mds compat rm_incompat 4 2026-03-31T20:38:24.902 INFO:tasks.workunit.client.0.vm03.stderr:incompat feature 4 not present in compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-31T20:38:24.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:988: test_mon_mds: fail_all_mds cephfs 2026-03-31T20:38:24.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:831: fail_all_mds: fs_name=cephfs 
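The max_mds verification above greps for a literal "max_mds\t2" in the plain-text fs get output. A format-stable variant would read the value from the JSON dump instead; this sketch assumes max_mds sits under the same mdsmap key that get_mds_gids parses later in this run:

    ceph fs get cephfs --format=json | jq -e '.mdsmap.max_mds == 2'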
2026-03-31T20:38:24.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:832: fail_all_mds: ceph fs set cephfs cluster_down true 2026-03-31T20:38:27.047 INFO:tasks.workunit.client.0.vm03.stderr:cephfs marked not joinable; MDS cannot join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 2026-03-31T20:38:27.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: get_mds_gids cephfs 2026-03-31T20:38:27.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:825: get_mds_gids: fs_name=cephfs 2026-03-31T20:38:27.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: ceph fs get cephfs --format=json 2026-03-31T20:38:27.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: python3 -c 'import json; import sys; print('\'' '\''.join([m['\''gid'\''].__str__() for m in json.load(sys.stdin)['\''mdsmap'\'']['\''info'\''].values()]))' 2026-03-31T20:38:27.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: mds_gids= 2026-03-31T20:38:27.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:837: fail_all_mds: check_mds_active cephfs 2026-03-31T20:38:27.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:804: check_mds_active: fs_name=cephfs 2026-03-31T20:38:27.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: ceph fs get cephfs 2026-03-31T20:38:27.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: grep active 2026-03-31T20:38:27.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:990: test_mon_mds: ceph mds compat show 2026-03-31T20:38:27.827 INFO:tasks.workunit.client.0.vm03.stdout:compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-31T20:38:27.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:991: test_mon_mds: ceph fs dump 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:e17 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:btime 2026-03-31T20:38:27:044988+0000 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:legacy client fscid: 1 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:38:28.089 
INFO:tasks.workunit.client.0.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:fs_name cephfs 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:epoch 17 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:flags 13 allow_snaps allow_multimds_snaps 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:created 2026-03-31T20:38:11.806555+0000 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:modified 2026-03-31T20:38:26.115701+0000 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:tableserver 0 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:root 0 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:session_timeout 60 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:session_autoclose 300 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:max_file_size 1099511627776 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:max_xattr_size 65536 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:required_client_features {} 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:last_failure 0 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:last_failure_osd_epoch 0 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:max_mds 2 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:in 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:up {} 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:failed 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:damaged 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:stopped 2026-03-31T20:38:28.089 INFO:tasks.workunit.client.0.vm03.stdout:data_pools [44] 2026-03-31T20:38:28.090 INFO:tasks.workunit.client.0.vm03.stdout:metadata_pool 45 2026-03-31T20:38:28.090 INFO:tasks.workunit.client.0.vm03.stdout:inline_data disabled 2026-03-31T20:38:28.090 INFO:tasks.workunit.client.0.vm03.stdout:balancer 2026-03-31T20:38:28.090 INFO:tasks.workunit.client.0.vm03.stdout:bal_rank_mask -1 2026-03-31T20:38:28.090 INFO:tasks.workunit.client.0.vm03.stdout:standby_count_wanted 0 2026-03-31T20:38:28.090 INFO:tasks.workunit.client.0.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-31T20:38:28.090 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:38:28.090 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-31T20:38:28.090 INFO:tasks.workunit.client.0.vm03.stderr:dumped fsmap epoch 17 2026-03-31T20:38:28.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:992: test_mon_mds: ceph fs get cephfs 2026-03-31T20:38:28.333 INFO:tasks.workunit.client.0.vm03.stdout:Filesystem 'cephfs' (1) 2026-03-31T20:38:28.333 INFO:tasks.workunit.client.0.vm03.stdout:fs_name cephfs 2026-03-31T20:38:28.333 INFO:tasks.workunit.client.0.vm03.stdout:epoch 17 2026-03-31T20:38:28.333 INFO:tasks.workunit.client.0.vm03.stdout:flags 13 allow_snaps allow_multimds_snaps 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:created 2026-03-31T20:38:11.806555+0000 
2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:modified 2026-03-31T20:38:26.115701+0000 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:tableserver 0 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:root 0 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:session_timeout 60 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:session_autoclose 300 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:max_file_size 1099511627776 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:max_xattr_size 65536 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:required_client_features {} 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:last_failure 0 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:last_failure_osd_epoch 0 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:max_mds 2 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:in 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:up {} 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:failed 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:damaged 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:stopped 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:data_pools [44] 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:metadata_pool 45 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:inline_data disabled 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:balancer 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:bal_rank_mask -1 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:standby_count_wanted 0 2026-03-31T20:38:28.334 INFO:tasks.workunit.client.0.vm03.stdout:qdb_cluster leader: 0 members: 2026-03-31T20:38:28.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:993: test_mon_mds: get_mds_gids cephfs 2026-03-31T20:38:28.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:825: get_mds_gids: fs_name=cephfs 2026-03-31T20:38:28.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: ceph fs get cephfs --format=json 2026-03-31T20:38:28.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: python3 -c 'import json; import sys; print('\'' '\''.join([m['\''gid'\''].__str__() for m in json.load(sys.stdin)['\''mdsmap'\'']['\''info'\''].values()]))' 2026-03-31T20:38:28.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:996: test_mon_mds: ceph mds metadata 2026-03-31T20:38:28.854 INFO:tasks.workunit.client.0.vm03.stdout:[] 2026-03-31T20:38:28.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:997: test_mon_mds: ceph mds versions 2026-03-31T20:38:29.126 
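The get_mds_gids helper traced above packs its JSON handling into a python3 -c one-liner with heavy shell quoting. The same logic restated without the nested quote escapes (it prints the gid of every MDS under mdsmap.info, space separated; with all MDS failed, as here, the output is empty):

    ceph fs get cephfs --format=json | python3 -c 'import json, sys; print(" ".join(str(m["gid"]) for m in json.load(sys.stdin)["mdsmap"]["info"].values()))'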
INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:38:29.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:998: test_mon_mds: ceph mds count-metadata os 2026-03-31T20:38:29.399 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:38:29.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1001: test_mon_mds: mdsmapfile=/tmp/cephtool.sYl/mdsmap.26274 2026-03-31T20:38:29.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1002: test_mon_mds: ceph fs dump -o /tmp/cephtool.sYl/mdsmap.26274 --no-log-to-stderr 2026-03-31T20:38:29.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1002: test_mon_mds: grep epoch 2026-03-31T20:38:29.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1002: test_mon_mds: sed 's/.*epoch //' 2026-03-31T20:38:29.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1002: test_mon_mds: current_epoch=17 2026-03-31T20:38:29.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1003: test_mon_mds: '[' -s /tmp/cephtool.sYl/mdsmap.26274 ']' 2026-03-31T20:38:29.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1004: test_mon_mds: rm /tmp/cephtool.sYl/mdsmap.26274 2026-03-31T20:38:29.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1006: test_mon_mds: ceph osd pool create data2 16 2026-03-31T20:38:30.116 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data2' already exists 2026-03-31T20:38:30.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1007: test_mon_mds: ceph osd pool create data3 16 2026-03-31T20:38:31.125 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data3' already exists 2026-03-31T20:38:31.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1008: test_mon_mds: ceph osd dump 2026-03-31T20:38:31.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1008: test_mon_mds: grep 'pool.*'\''data2'\''' 2026-03-31T20:38:31.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1008: test_mon_mds: awk '{print $2;}' 2026-03-31T20:38:31.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1008: test_mon_mds: data2_pool=46 2026-03-31T20:38:31.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1009: test_mon_mds: ceph osd dump 2026-03-31T20:38:31.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1009: test_mon_mds: grep 'pool.*'\''data3'\''' 2026-03-31T20:38:31.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1009: test_mon_mds: awk '{print $2;}' 2026-03-31T20:38:31.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1009: test_mon_mds: data3_pool=47 2026-03-31T20:38:31.540 
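The pool-id lookups above scrape the plain osd dump with grep and awk. The same ids can be read from the JSON form; this assumes the usual osd dump layout in which each entry under .pools carries .pool (the numeric id) and .pool_name:

    ceph osd dump --format=json | jq '.pools[] | select(.pool_name == "data2") | .pool'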
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1010: test_mon_mds: ceph fs add_data_pool cephfs 46 2026-03-31T20:38:32.161 INFO:tasks.workunit.client.0.vm03.stderr:Second attempt of previously successful command failed with EINVAL: RADOS pool 'data2' is already used by filesystem 'cephfs' as a 'data' pool for application 'cephfs' 2026-03-31T20:38:32.161 INFO:tasks.workunit.client.0.vm03.stderr:RADOS pool 'data2' is already used by filesystem 'cephfs' as a 'data' pool for application 'cephfs' 2026-03-31T20:38:32.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1011: test_mon_mds: ceph fs add_data_pool cephfs 47 2026-03-31T20:38:33.161 INFO:tasks.workunit.client.0.vm03.stderr:Second attempt of previously successful command failed with EINVAL: RADOS pool 'data3' is already used by filesystem 'cephfs' as a 'data' pool for application 'cephfs' 2026-03-31T20:38:33.161 INFO:tasks.workunit.client.0.vm03.stderr:RADOS pool 'data3' is already used by filesystem 'cephfs' as a 'data' pool for application 'cephfs' 2026-03-31T20:38:33.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1012: test_mon_mds: ceph fs add_data_pool cephfs 100 2026-03-31T20:38:33.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1012: test_mon_mds: true 2026-03-31T20:38:33.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1013: test_mon_mds: check_response 'Error ENOENT' 2026-03-31T20:38:33.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='Error ENOENT' 2026-03-31T20:38:33.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:38:33.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:38:33.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:38:33.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'Error ENOENT' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:38:33.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1014: test_mon_mds: ceph fs add_data_pool cephfs foobarbaz 2026-03-31T20:38:33.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1014: test_mon_mds: true 2026-03-31T20:38:33.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1015: test_mon_mds: check_response 'Error ENOENT' 2026-03-31T20:38:33.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='Error ENOENT' 2026-03-31T20:38:33.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode= 2026-03-31T20:38:33.506 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode= 2026-03-31T20:38:33.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' '' -a '!=' ']' 2026-03-31T20:38:33.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'Error ENOENT' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:38:33.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1016: test_mon_mds: ceph fs rm_data_pool cephfs 46 2026-03-31T20:38:35.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1017: test_mon_mds: ceph fs rm_data_pool cephfs 47 2026-03-31T20:38:37.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1018: test_mon_mds: ceph osd pool delete data2 data2 --yes-i-really-really-mean-it 2026-03-31T20:38:38.182 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data2' does not exist 2026-03-31T20:38:38.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1019: test_mon_mds: ceph osd pool delete data3 data3 --yes-i-really-really-mean-it 2026-03-31T20:38:39.190 INFO:tasks.workunit.client.0.vm03.stderr:pool 'data3' does not exist 2026-03-31T20:38:39.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1020: test_mon_mds: ceph fs set cephfs max_mds 4 2026-03-31T20:38:41.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1021: test_mon_mds: ceph fs set cephfs max_mds 3 2026-03-31T20:38:43.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1022: test_mon_mds: ceph fs set cephfs max_mds 256 2026-03-31T20:38:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1023: test_mon_mds: expect_false ceph fs set cephfs max_mds 257 2026-03-31T20:38:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:38:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs set cephfs max_mds 257 2026-03-31T20:38:45.352 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: may not have more than 256 MDS ranks 2026-03-31T20:38:45.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:38:45.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1024: test_mon_mds: ceph fs set cephfs max_mds 4 2026-03-31T20:38:47.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1025: test_mon_mds: ceph fs set cephfs max_mds 256 2026-03-31T20:38:49.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1026: test_mon_mds: expect_false ceph fs set cephfs max_mds 257 2026-03-31T20:38:49.228 
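The ENOENT assertions earlier in test_mon_mds work by letting the command fail, capturing its output to a temp file, and grepping that file; check_response's trace shows the grep against /tmp/cephtool.sYl/test_invalid.NL9. A minimal reproduction of the pattern (the temp file here is hypothetical, not the test's own):

    TMPFILE=$(mktemp)
    ceph fs add_data_pool cephfs foobarbaz 2>"$TMPFILE" || true
    grep --quiet -- 'Error ENOENT' "$TMPFILE"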
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:38:49.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs set cephfs max_mds 257 2026-03-31T20:38:49.398 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: may not have more than 256 MDS ranks 2026-03-31T20:38:49.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:38:49.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1027: test_mon_mds: expect_false ceph fs set cephfs max_mds asdf 2026-03-31T20:38:49.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:38:49.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs set cephfs max_mds asdf 2026-03-31T20:38:49.569 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: Expected option value to be integer, got 'asdf' 2026-03-31T20:38:49.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:38:49.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1028: test_mon_mds: expect_false ceph fs set cephfs inline_data true 2026-03-31T20:38:49.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:38:49.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs set cephfs inline_data true 2026-03-31T20:38:49.738 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: Inline data support is deprecated and will be removed in a future release. Add --yes-i-really-really-mean-it if you are certain you want this enabled. 
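Both max_mds 257 attempts above are wrapped in expect_false, which inverts the exit status of the command it runs; the monitor enforces a hard ceiling of 256 MDS ranks, so 257 is rejected with EINVAL while 256 is accepted. From the trace (lines 35-36 of the workunit), the helper is essentially:

    expect_false() {
        set -x
        if "$@"; then return 1; else return 0; fi
    }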
2026-03-31T20:38:49.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:38:49.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1029: test_mon_mds: ceph fs set cephfs inline_data true --yes-i-really-really-mean-it 2026-03-31T20:38:51.228 INFO:tasks.workunit.client.0.vm03.stderr:inline data enabled 2026-03-31T20:38:51.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1030: test_mon_mds: ceph fs set cephfs inline_data yes --yes-i-really-really-mean-it 2026-03-31T20:38:53.243 INFO:tasks.workunit.client.0.vm03.stderr:inline data enabled 2026-03-31T20:38:53.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1031: test_mon_mds: ceph fs set cephfs inline_data 1 --yes-i-really-really-mean-it 2026-03-31T20:38:55.257 INFO:tasks.workunit.client.0.vm03.stderr:inline data enabled 2026-03-31T20:38:55.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1032: test_mon_mds: expect_false ceph fs set cephfs inline_data --yes-i-really-really-mean-it 2026-03-31T20:38:55.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:38:55.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs set cephfs inline_data --yes-i-really-really-mean-it 2026-03-31T20:38:55.432 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: missing required parameter val() 2026-03-31T20:38:55.432 INFO:tasks.workunit.client.0.vm03.stderr:fs set [--yes-i-really-mean-it] [--yes-i-really-really-mean-it] : set fs parameter to 2026-03-31T20:38:55.432 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:38:55.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:38:55.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1033: test_mon_mds: ceph fs set cephfs inline_data false 2026-03-31T20:38:57.264 INFO:tasks.workunit.client.0.vm03.stderr:inline data disabled 2026-03-31T20:38:57.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1034: test_mon_mds: ceph fs set cephfs inline_data no 2026-03-31T20:38:59.280 INFO:tasks.workunit.client.0.vm03.stderr:inline data disabled 2026-03-31T20:38:59.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1035: test_mon_mds: ceph fs set cephfs inline_data 0 2026-03-31T20:39:01.292 INFO:tasks.workunit.client.0.vm03.stderr:inline data disabled 2026-03-31T20:39:01.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1036: test_mon_mds: expect_false ceph fs set cephfs inline_data asdf 2026-03-31T20:39:01.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:39:01.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs set cephfs inline_data asdf 
2026-03-31T20:39:01.469 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: value must be false|no|0 or true|yes|1 2026-03-31T20:39:01.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:39:01.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1037: test_mon_mds: ceph fs set cephfs max_file_size 1048576 2026-03-31T20:39:03.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1038: test_mon_mds: expect_false ceph fs set cephfs max_file_size 123asdf 2026-03-31T20:39:03.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:39:03.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs set cephfs max_file_size 123asdf 2026-03-31T20:39:03.481 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: max_file_size requires an integer value 2026-03-31T20:39:03.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:39:03.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1040: test_mon_mds: expect_false ceph fs set cephfs allow_new_snaps 2026-03-31T20:39:03.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:39:03.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs set cephfs allow_new_snaps 2026-03-31T20:39:03.649 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: missing required parameter val() 2026-03-31T20:39:03.649 INFO:tasks.workunit.client.0.vm03.stderr:fs set [--yes-i-really-mean-it] [--yes-i-really-really-mean-it] : set fs parameter to 2026-03-31T20:39:03.649 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command 2026-03-31T20:39:03.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:39:03.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1041: test_mon_mds: ceph fs set cephfs allow_new_snaps true 2026-03-31T20:39:05.319 INFO:tasks.workunit.client.0.vm03.stderr:enabled new snapshots 2026-03-31T20:39:05.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1042: test_mon_mds: ceph fs set cephfs allow_new_snaps 0 2026-03-31T20:39:07.326 INFO:tasks.workunit.client.0.vm03.stderr:disabled new snapshots 2026-03-31T20:39:07.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1043: test_mon_mds: ceph fs set cephfs allow_new_snaps false 2026-03-31T20:39:09.340 INFO:tasks.workunit.client.0.vm03.stderr:disabled new snapshots 2026-03-31T20:39:09.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1044: test_mon_mds: ceph fs set cephfs allow_new_snaps no 2026-03-31T20:39:11.357 INFO:tasks.workunit.client.0.vm03.stderr:disabled new snapshots 2026-03-31T20:39:11.369 
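Two behaviours are worth noting in the inline_data and allow_new_snaps runs above: enabling inline data is deprecated and EPERM-gated behind --yes-i-really-really-mean-it (disabling needs no override), and both options share one boolean parser that accepts exactly true|yes|1 and false|no|0, rejecting anything else with EINVAL:

    ceph fs set cephfs inline_data true --yes-i-really-really-mean-it   # enable: override required
    ceph fs set cephfs inline_data false                                # disable: no override
    ceph fs set cephfs allow_new_snaps yes                              # accepted spelling
    ceph fs set cephfs allow_new_snaps banana                           # rejected with EINVAL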
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1045: test_mon_mds: expect_false ceph fs set cephfs allow_new_snaps taco 2026-03-31T20:39:11.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:39:11.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs set cephfs allow_new_snaps taco 2026-03-31T20:39:11.531 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: value must be false|no|0 or true|yes|1 2026-03-31T20:39:11.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:39:11.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1049: test_mon_mds: ceph osd pool create mds-ec-pool 16 16 erasure 2026-03-31T20:39:12.417 INFO:tasks.workunit.client.0.vm03.stderr:pool 'mds-ec-pool' already exists 2026-03-31T20:39:12.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1050: test_mon_mds: set +e 2026-03-31T20:39:12.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1051: test_mon_mds: ceph fs add_data_pool cephfs mds-ec-pool 2026-03-31T20:39:12.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1052: test_mon_mds: check_response erasure-code 22 22 2026-03-31T20:39:12.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string=erasure-code 2026-03-31T20:39:12.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:12.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:12.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:12.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- erasure-code /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:12.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1053: test_mon_mds: set -e 2026-03-31T20:39:12.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1054: test_mon_mds: ceph osd dump 2026-03-31T20:39:12.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1054: test_mon_mds: grep 'pool.* '\''mds-ec-pool' 2026-03-31T20:39:12.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1054: test_mon_mds: awk '{print $2;}' 2026-03-31T20:39:12.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1054: test_mon_mds: ec_poolnum=48 2026-03-31T20:39:12.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1055: test_mon_mds: 
ceph osd dump 2026-03-31T20:39:12.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1055: test_mon_mds: grep 'pool.* '\''fs_data' 2026-03-31T20:39:12.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1055: test_mon_mds: awk '{print $2;}' 2026-03-31T20:39:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1055: test_mon_mds: data_poolnum=44 2026-03-31T20:39:13.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1056: test_mon_mds: ceph osd dump 2026-03-31T20:39:13.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1056: test_mon_mds: grep 'pool.* '\''fs_metadata' 2026-03-31T20:39:13.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1056: test_mon_mds: awk '{print $2;}' 2026-03-31T20:39:13.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1056: test_mon_mds: metadata_poolnum=45 2026-03-31T20:39:13.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1058: test_mon_mds: fail_all_mds cephfs 2026-03-31T20:39:13.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:831: fail_all_mds: fs_name=cephfs 2026-03-31T20:39:13.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:832: fail_all_mds: ceph fs set cephfs cluster_down true 2026-03-31T20:39:15.396 INFO:tasks.workunit.client.0.vm03.stderr:cephfs marked not joinable; MDS cannot join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 
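[note] The xtrace above keeps exercising two small helpers from qa/workunits/cephtool/test.sh: expect_false (trace lines 35-36) and check_response (trace lines 103-111). A minimal reconstruction consistent with the traced behavior — a sketch, not necessarily the exact source — where $TMPFILE stands in for the captured-output file (/tmp/cephtool.sYl/test_invalid.NL9 in this run):

    expect_false() {
        set -x
        # succeed only when the wrapped command fails
        if "$@"; then return 1; else return 0; fi
    }

    check_response() {
        expected_string=$1
        retcode=$2
        expected_retcode=$3
        # check the saved return code first, then grep the captured output
        if [ "$expected_retcode" -a "$retcode" != "$expected_retcode" ]; then
            echo "return code invalid: got $retcode, expected $expected_retcode"
            return 1
        fi
        grep --quiet -- "$expected_string" "$TMPFILE"
    }

The pool ids picked up along the way (ec_poolnum=48, data_poolnum=44, metadata_poolnum=45) all come from the same scrape: ceph osd dump | grep "pool.* 'NAME'" | awk '{print $2;}'.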
2026-03-31T20:39:15.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: get_mds_gids cephfs 2026-03-31T20:39:15.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:825: get_mds_gids: fs_name=cephfs 2026-03-31T20:39:15.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: ceph fs get cephfs --format=json 2026-03-31T20:39:15.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: python3 -c 'import json; import sys; print('\'' '\''.join([m['\''gid'\''].__str__() for m in json.load(sys.stdin)['\''mdsmap'\'']['\''info'\''].values()]))' 2026-03-31T20:39:15.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: mds_gids= 2026-03-31T20:39:15.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:837: fail_all_mds: check_mds_active cephfs 2026-03-31T20:39:15.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:804: check_mds_active: fs_name=cephfs 2026-03-31T20:39:15.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: ceph fs get cephfs 2026-03-31T20:39:15.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: grep active 2026-03-31T20:39:15.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1060: test_mon_mds: set +e 2026-03-31T20:39:15.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1062: test_mon_mds: expect_false ceph mds rmfailed 0 2026-03-31T20:39:15.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:39:15.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph mds rmfailed 0 2026-03-31T20:39:16.061 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: WARNING: this can make your filesystem inaccessible! Add --yes-i-really-mean-it if you are sure you wish to continue. 
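[note] fail_all_mds (trace lines 831-837) drives every teardown in this stretch, with get_mds_gids (825-826) and check_mds_active (804-805) underneath it. Note that mds_gids= comes back empty on every call — this job runs no MDS daemons — so the middle of the function never appears in the trace; the per-gid loop below is inferred from the function's purpose and should be read as an assumption, not as traced fact:

    get_mds_gids() {
        local fs_name=$1
        ceph fs get "$fs_name" --format=json |
            python3 -c 'import json, sys; print(" ".join(str(m["gid"]) for m in json.load(sys.stdin)["mdsmap"]["info"].values()))'
    }

    check_mds_active() {
        local fs_name=$1
        ceph fs get "$fs_name" | grep active
    }

    fail_all_mds() {
        local fs_name=$1
        # deprecated spelling; the warning above says to use "joinable" instead
        ceph fs set "$fs_name" cluster_down true
        local mds_gids=$(get_mds_gids "$fs_name")
        for gid in $mds_gids; do ceph mds fail "$gid"; done   # inferred, never traced here
        check_mds_active "$fs_name" && return 1 || true
    }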
2026-03-31T20:39:16.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:39:16.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1063: test_mon_mds: ceph mds rmfailed 0 --yes-i-really-mean-it 2026-03-31T20:39:16.239 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: Rank '0' not foundinvalid role '0' 2026-03-31T20:39:16.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1064: test_mon_mds: set -e 2026-03-31T20:39:16.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1067: test_mon_mds: expect_false ceph fs new cephfs 45 44 --yes-i-really-mean-it 2026-03-31T20:39:16.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1070: test_mon_mds: ceph fs reset cephfs --yes-i-really-mean-it 2026-03-31T20:39:18.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1073: test_mon_mds: ceph osd pool create fs_metadata2 16 2026-03-31T20:39:19.481 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fs_metadata2' already exists 2026-03-31T20:39:19.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1074: test_mon_mds: ceph osd pool create fs_data2 16 2026-03-31T20:39:20.503 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fs_data2' already exists 2026-03-31T20:39:20.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1075: test_mon_mds: set +e 2026-03-31T20:39:20.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1076: test_mon_mds: expect_false ceph fs new cephfs2 fs_metadata2 fs_data2 2026-03-31T20:39:20.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:39:20.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph fs new cephfs2 fs_metadata2 fs_data2 2026-03-31T20:39:21.445 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:39:21.445+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check update: 2 filesystems are offline (MDS_ALL_DOWN) 2026-03-31T20:39:22.474 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs2' already exists 2026-03-31T20:39:22.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 1 2026-03-31T20:39:22.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1077: test_mon_mds: set -e 2026-03-31T20:39:22.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1080: test_mon_mds: ceph fs flag set enable_multiple true --yes-i-really-mean-it 2026-03-31T20:39:24.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1081: test_mon_mds: ceph fs new cephfs2 fs_metadata2 fs_data2 2026-03-31T20:39:26.503 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs2' already exists 2026-03-31T20:39:26.516 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1084: test_mon_mds: fail_all_mds cephfs2 2026-03-31T20:39:26.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:831: fail_all_mds: fs_name=cephfs2 2026-03-31T20:39:26.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:832: fail_all_mds: ceph fs set cephfs2 cluster_down true 2026-03-31T20:39:28.516 INFO:tasks.workunit.client.0.vm03.stderr:cephfs2 marked not joinable; MDS cannot join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 2026-03-31T20:39:28.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: get_mds_gids cephfs2 2026-03-31T20:39:28.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:825: get_mds_gids: fs_name=cephfs2 2026-03-31T20:39:28.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: ceph fs get cephfs2 --format=json 2026-03-31T20:39:28.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: python3 -c 'import json; import sys; print('\'' '\''.join([m['\''gid'\''].__str__() for m in json.load(sys.stdin)['\''mdsmap'\'']['\''info'\''].values()]))' 2026-03-31T20:39:28.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: mds_gids= 2026-03-31T20:39:28.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:837: fail_all_mds: check_mds_active cephfs2 2026-03-31T20:39:28.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:804: check_mds_active: fs_name=cephfs2 2026-03-31T20:39:28.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: ceph fs get cephfs2 2026-03-31T20:39:28.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: grep active 2026-03-31T20:39:29.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1085: test_mon_mds: ceph fs rm cephfs2 --yes-i-really-mean-it 2026-03-31T20:39:29.514 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:39:29.513+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check update: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-31T20:39:30.527 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs2' does not exist 2026-03-31T20:39:30.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1086: test_mon_mds: ceph osd pool delete fs_metadata2 fs_metadata2 --yes-i-really-really-mean-it 2026-03-31T20:39:31.594 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fs_metadata2' does not exist 2026-03-31T20:39:31.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1087: test_mon_mds: ceph osd pool delete fs_data2 fs_data2 --yes-i-really-really-mean-it 2026-03-31T20:39:32.603 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fs_data2' does not 
exist 2026-03-31T20:39:32.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1089: test_mon_mds: fail_all_mds cephfs 2026-03-31T20:39:32.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:831: fail_all_mds: fs_name=cephfs 2026-03-31T20:39:32.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:832: fail_all_mds: ceph fs set cephfs cluster_down true 2026-03-31T20:39:34.556 INFO:tasks.workunit.client.0.vm03.stderr:cephfs marked not joinable; MDS cannot join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 2026-03-31T20:39:34.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: get_mds_gids cephfs 2026-03-31T20:39:34.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:825: get_mds_gids: fs_name=cephfs 2026-03-31T20:39:34.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: ceph fs get cephfs --format=json 2026-03-31T20:39:34.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: python3 -c 'import json; import sys; print('\'' '\''.join([m['\''gid'\''].__str__() for m in json.load(sys.stdin)['\''mdsmap'\'']['\''info'\''].values()]))' 2026-03-31T20:39:34.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: mds_gids= 2026-03-31T20:39:34.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:837: fail_all_mds: check_mds_active cephfs 2026-03-31T20:39:34.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:804: check_mds_active: fs_name=cephfs 2026-03-31T20:39:34.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: ceph fs get cephfs 2026-03-31T20:39:34.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: grep active 2026-03-31T20:39:35.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1092: test_mon_mds: ceph fs rm cephfs --yes-i-really-mean-it 2026-03-31T20:39:36.586 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs' does not exist 2026-03-31T20:39:36.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1094: test_mon_mds: set +e 2026-03-31T20:39:36.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1095: test_mon_mds: ceph fs new cephfs fs_metadata mds-ec-pool --force 2026-03-31T20:39:36.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1096: test_mon_mds: check_response erasure-code 22 22 2026-03-31T20:39:36.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string=erasure-code 2026-03-31T20:39:36.767 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:36.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:36.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:36.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- erasure-code /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:36.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1097: test_mon_mds: ceph fs new cephfs mds-ec-pool fs_data 2026-03-31T20:39:36.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1098: test_mon_mds: check_response 'already used by filesystem' 22 22 2026-03-31T20:39:36.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='already used by filesystem' 2026-03-31T20:39:36.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:36.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:36.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:36.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'already used by filesystem' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:36.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1099: test_mon_mds: ceph fs new cephfs mds-ec-pool fs_data --force 2026-03-31T20:39:37.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1100: test_mon_mds: check_response erasure-code 22 22 2026-03-31T20:39:37.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string=erasure-code 2026-03-31T20:39:37.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:37.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:37.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:37.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- erasure-code /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:37.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1101: test_mon_mds: ceph fs new cephfs mds-ec-pool mds-ec-pool 
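[note] Two refusals are under test in the stretch above. A second filesystem is not allowed until the cluster-wide enable_multiple flag is set, and an erasure-coded pool that cannot take overwrites is rejected with EINVAL ('erasure-code') in either pool position — --force does not override that. The positive path, minus the negative probes, condenses to:

    # a second filesystem is refused until the flag is set cluster-wide
    ceph fs flag set enable_multiple true --yes-i-really-mean-it
    ceph fs new cephfs2 fs_metadata2 fs_data2

    # cleanup mirrors creation: mark the fs not joinable, drop it, then the pools
    ceph fs set cephfs2 cluster_down true
    ceph fs rm cephfs2 --yes-i-really-mean-it
    ceph osd pool delete fs_metadata2 fs_metadata2 --yes-i-really-really-mean-it
    ceph osd pool delete fs_data2 fs_data2 --yes-i-really-really-mean-it

(The 'already exists' / 'does not exist' replies throughout suggest the cluster carries state from an earlier pass of the same script, which is why some probes do not behave as they would on a fresh cluster — e.g. the expect_false ceph fs new cephfs2 at 1076, which returned 1.)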
2026-03-31T20:39:37.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1102: test_mon_mds: check_response erasure-code 22 22 2026-03-31T20:39:37.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string=erasure-code 2026-03-31T20:39:37.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:37.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:37.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:37.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- erasure-code /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:37.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1103: test_mon_mds: set -e 2026-03-31T20:39:37.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1106: test_mon_mds: ceph osd pool create mds-tier 2 2026-03-31T20:39:37.648 INFO:tasks.workunit.client.0.vm03.stderr:pool 'mds-tier' already exists 2026-03-31T20:39:37.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1107: test_mon_mds: ceph osd tier add mds-ec-pool mds-tier 2026-03-31T20:39:38.665 INFO:tasks.workunit.client.0.vm03.stderr:pool 'mds-tier' is now (or already was) a tier of 'mds-ec-pool' 2026-03-31T20:39:38.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1108: test_mon_mds: ceph osd tier set-overlay mds-ec-pool mds-tier 2026-03-31T20:39:39.668 INFO:tasks.workunit.client.0.vm03.stderr:overlay for 'mds-ec-pool' is now (or already was) 'mds-tier' 2026-03-31T20:39:39.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1109: test_mon_mds: ceph osd dump 2026-03-31T20:39:39.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1109: test_mon_mds: grep 'pool.* '\''mds-tier' 2026-03-31T20:39:39.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1109: test_mon_mds: awk '{print $2;}' 2026-03-31T20:39:39.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1109: test_mon_mds: tier_poolnum=51 2026-03-31T20:39:39.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1112: test_mon_mds: ceph osd tier cache-mode mds-tier readonly --yes-i-really-mean-it 2026-03-31T20:39:40.681 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'mds-tier' to readonly 2026-03-31T20:39:40.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1113: test_mon_mds: set +e 2026-03-31T20:39:40.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1114: test_mon_mds: ceph fs new cephfs fs_metadata mds-ec-pool 
--force 2026-03-31T20:39:40.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1115: test_mon_mds: check_response 'has a write tier (mds-tier) that is configured to forward' 22 22 2026-03-31T20:39:40.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='has a write tier (mds-tier) that is configured to forward' 2026-03-31T20:39:40.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:40.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:40.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:40.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'has a write tier (mds-tier) that is configured to forward' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:40.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1116: test_mon_mds: set -e 2026-03-31T20:39:40.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1119: test_mon_mds: ceph osd tier cache-mode mds-tier writeback 2026-03-31T20:39:41.688 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'mds-tier' to writeback 2026-03-31T20:39:41.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1120: test_mon_mds: ceph fs new cephfs fs_metadata mds-ec-pool --force 2026-03-31T20:39:42.630 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:39:42.629+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-31T20:39:43.639 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs' already exists 2026-03-31T20:39:43.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1124: test_mon_mds: set +e 2026-03-31T20:39:43.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1125: test_mon_mds: ceph osd tier remove-overlay mds-ec-pool 2026-03-31T20:39:43.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1126: test_mon_mds: check_response 'in use by CephFS' 16 16 2026-03-31T20:39:43.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='in use by CephFS' 2026-03-31T20:39:43.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=16 2026-03-31T20:39:43.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=16 2026-03-31T20:39:43.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 16 -a 16 '!=' 16 ']' 2026-03-31T20:39:43.807 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'in use by CephFS' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:43.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1127: test_mon_mds: ceph osd tier remove mds-ec-pool mds-tier 2026-03-31T20:39:43.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1128: test_mon_mds: check_response 'in use by CephFS' 16 16 2026-03-31T20:39:43.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='in use by CephFS' 2026-03-31T20:39:43.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=16 2026-03-31T20:39:43.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=16 2026-03-31T20:39:43.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 16 -a 16 '!=' 16 ']' 2026-03-31T20:39:43.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'in use by CephFS' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:43.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1129: test_mon_mds: set -e 2026-03-31T20:39:43.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1131: test_mon_mds: fail_all_mds cephfs 2026-03-31T20:39:43.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:831: fail_all_mds: fs_name=cephfs 2026-03-31T20:39:43.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:832: fail_all_mds: ceph fs set cephfs cluster_down true 2026-03-31T20:39:45.664 INFO:tasks.workunit.client.0.vm03.stderr:cephfs marked not joinable; MDS cannot join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 
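[note] The tiering checks pivot on the cache mode of mds-tier. With the tier readonly, ceph fs new on the base pool is refused (EINVAL, 'has a write tier (mds-tier) that is configured to forward'); in writeback mode the same command is accepted; and once the filesystem owns the base pool, detaching the tier is refused with EBUSY ('in use by CephFS'). Condensed from the trace:

    ceph osd pool create mds-tier 2
    ceph osd tier add mds-ec-pool mds-tier
    ceph osd tier set-overlay mds-ec-pool mds-tier

    ceph osd tier cache-mode mds-tier readonly --yes-i-really-mean-it
    ceph fs new cephfs fs_metadata mds-ec-pool --force   # EINVAL: write tier forwards
    ceph osd tier cache-mode mds-tier writeback
    ceph fs new cephfs fs_metadata mds-ec-pool --force   # accepted

    ceph osd tier remove-overlay mds-ec-pool             # EBUSY: in use by CephFS
    ceph osd tier remove mds-ec-pool mds-tier            # EBUSY: in use by CephFS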
2026-03-31T20:39:45.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: get_mds_gids cephfs 2026-03-31T20:39:45.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:825: get_mds_gids: fs_name=cephfs 2026-03-31T20:39:45.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: ceph fs get cephfs --format=json 2026-03-31T20:39:45.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: python3 -c 'import json; import sys; print('\'' '\''.join([m['\''gid'\''].__str__() for m in json.load(sys.stdin)['\''mdsmap'\'']['\''info'\''].values()]))' 2026-03-31T20:39:45.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: mds_gids= 2026-03-31T20:39:45.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:837: fail_all_mds: check_mds_active cephfs 2026-03-31T20:39:45.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:804: check_mds_active: fs_name=cephfs 2026-03-31T20:39:45.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: ceph fs get cephfs 2026-03-31T20:39:45.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: grep active 2026-03-31T20:39:46.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1132: test_mon_mds: ceph fs rm cephfs --yes-i-really-mean-it 2026-03-31T20:39:47.673 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs' does not exist 2026-03-31T20:39:47.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1135: test_mon_mds: set +e 2026-03-31T20:39:47.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1136: test_mon_mds: ceph fs new cephfs fs_metadata mds-tier --force 2026-03-31T20:39:47.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1137: test_mon_mds: check_response 'in use as a cache tier' 22 22 2026-03-31T20:39:47.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='in use as a cache tier' 2026-03-31T20:39:47.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:47.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:47.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:47.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'in use as a cache tier' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:47.853 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1138: test_mon_mds: ceph fs new cephfs mds-tier fs_data 2026-03-31T20:39:48.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1139: test_mon_mds: check_response 'already used by filesystem' 22 22 2026-03-31T20:39:48.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='already used by filesystem' 2026-03-31T20:39:48.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:48.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:48.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:48.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'already used by filesystem' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:48.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1140: test_mon_mds: ceph fs new cephfs mds-tier fs_data --force 2026-03-31T20:39:48.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1141: test_mon_mds: check_response 'in use as a cache tier' 22 22 2026-03-31T20:39:48.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='in use as a cache tier' 2026-03-31T20:39:48.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:48.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:48.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:48.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'in use as a cache tier' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:48.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1142: test_mon_mds: ceph fs new cephfs mds-tier mds-tier 2026-03-31T20:39:48.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1143: test_mon_mds: check_response 'already used by filesystem' 22 22 2026-03-31T20:39:48.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='already used by filesystem' 2026-03-31T20:39:48.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:48.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: 
expected_retcode=22 2026-03-31T20:39:48.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:48.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'already used by filesystem' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:48.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1144: test_mon_mds: ceph fs new cephfs mds-tier mds-tier --force 2026-03-31T20:39:48.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1145: test_mon_mds: check_response 'in use as a cache tier' 22 22 2026-03-31T20:39:48.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='in use as a cache tier' 2026-03-31T20:39:48.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:39:48.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:39:48.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:39:48.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'in use as a cache tier' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:48.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1146: test_mon_mds: set -e 2026-03-31T20:39:48.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1149: test_mon_mds: ceph osd tier remove-overlay mds-ec-pool 2026-03-31T20:39:48.742 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'mds-ec-pool' 2026-03-31T20:39:48.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1150: test_mon_mds: ceph osd tier remove mds-ec-pool mds-tier 2026-03-31T20:39:49.747 INFO:tasks.workunit.client.0.vm03.stderr:pool 'mds-tier' is now (or already was) not a tier of 'mds-ec-pool' 2026-03-31T20:39:49.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1153: test_mon_mds: ceph fs new cephfs fs_metadata mds-tier --force 2026-03-31T20:39:50.702 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:39:50.701+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-31T20:39:51.715 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs' already exists 2026-03-31T20:39:51.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1157: test_mon_mds: set +e 2026-03-31T20:39:51.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1158: test_mon_mds: ceph osd tier add mds-ec-pool mds-tier 2026-03-31T20:39:51.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1159: 
test_mon_mds: check_response 'in use by CephFS' 16 16 2026-03-31T20:39:51.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='in use by CephFS' 2026-03-31T20:39:51.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=16 2026-03-31T20:39:51.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=16 2026-03-31T20:39:51.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 16 -a 16 '!=' 16 ']' 2026-03-31T20:39:51.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'in use by CephFS' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:39:51.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1160: test_mon_mds: set -e 2026-03-31T20:39:51.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1162: test_mon_mds: fail_all_mds cephfs 2026-03-31T20:39:51.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:831: fail_all_mds: fs_name=cephfs 2026-03-31T20:39:51.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:832: fail_all_mds: ceph fs set cephfs cluster_down true 2026-03-31T20:39:53.731 INFO:tasks.workunit.client.0.vm03.stderr:cephfs marked not joinable; MDS cannot join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 
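[note] The inverse guards follow: while mds-tier is still a cache tier it is refused as either filesystem pool (EINVAL, 'in use as a cache tier', with or without --force), and once it has been detached and handed to CephFS, re-attaching it as a tier is what gets refused (EBUSY, 'in use by CephFS'). The order that makes the reuse legal, as traced at lines 1131-1158:

    ceph fs set cephfs cluster_down true                 # fail_all_mds
    ceph fs rm cephfs --yes-i-really-mean-it             # release the base pool
    ceph osd tier remove-overlay mds-ec-pool
    ceph osd tier remove mds-ec-pool mds-tier
    ceph fs new cephfs fs_metadata mds-tier --force      # mds-tier is an ordinary pool now
    ceph osd tier add mds-ec-pool mds-tier               # EBUSY: in use by CephFS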
2026-03-31T20:39:53.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: get_mds_gids cephfs 2026-03-31T20:39:53.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:825: get_mds_gids: fs_name=cephfs 2026-03-31T20:39:53.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: ceph fs get cephfs --format=json 2026-03-31T20:39:53.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: python3 -c 'import json; import sys; print('\'' '\''.join([m['\''gid'\''].__str__() for m in json.load(sys.stdin)['\''mdsmap'\'']['\''info'\''].values()]))' 2026-03-31T20:39:53.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: mds_gids= 2026-03-31T20:39:53.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:837: fail_all_mds: check_mds_active cephfs 2026-03-31T20:39:53.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:804: check_mds_active: fs_name=cephfs 2026-03-31T20:39:53.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: ceph fs get cephfs 2026-03-31T20:39:53.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: grep active 2026-03-31T20:39:54.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1163: test_mon_mds: ceph fs rm cephfs --yes-i-really-mean-it 2026-03-31T20:39:55.744 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs' does not exist 2026-03-31T20:39:55.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1167: test_mon_mds: ceph osd pool set mds-ec-pool allow_ec_overwrites true 2026-03-31T20:39:57.764 INFO:tasks.workunit.client.0.vm03.stderr:set pool 48 allow_ec_overwrites to true 2026-03-31T20:39:57.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1168: test_mon_mds: ceph fs new cephfs fs_metadata mds-ec-pool --force 2026-03-31T20:39:58.759 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:39:58.756+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-31T20:39:59.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1169: test_mon_mds: fail_all_mds cephfs 2026-03-31T20:39:59.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:831: fail_all_mds: fs_name=cephfs 2026-03-31T20:39:59.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:832: fail_all_mds: ceph fs set cephfs cluster_down true 2026-03-31T20:39:59.998 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:39:59.996+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : overall HEALTH_ERR 1 auth entities have invalid capabilities; 1 filesystem is offline; 1 filesystem is online with fewer MDS than max_mds 2026-03-31T20:40:01.790 
INFO:tasks.workunit.client.0.vm03.stderr:cephfs marked not joinable; MDS cannot join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 2026-03-31T20:40:01.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: get_mds_gids cephfs 2026-03-31T20:40:01.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:825: get_mds_gids: fs_name=cephfs 2026-03-31T20:40:01.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: ceph fs get cephfs --format=json 2026-03-31T20:40:01.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: python3 -c 'import json; import sys; print('\'' '\''.join([m['\''gid'\''].__str__() for m in json.load(sys.stdin)['\''mdsmap'\'']['\''info'\''].values()]))' 2026-03-31T20:40:02.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: mds_gids= 2026-03-31T20:40:02.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:837: fail_all_mds: check_mds_active cephfs 2026-03-31T20:40:02.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:804: check_mds_active: fs_name=cephfs 2026-03-31T20:40:02.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: ceph fs get cephfs 2026-03-31T20:40:02.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: grep active 2026-03-31T20:40:02.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1170: test_mon_mds: ceph fs rm cephfs --yes-i-really-mean-it 2026-03-31T20:40:03.803 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs' does not exist 2026-03-31T20:40:03.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1173: test_mon_mds: set +e 2026-03-31T20:40:03.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1174: test_mon_mds: ceph fs new cephfs mds-ec-pool fs_data 2026-03-31T20:40:03.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1175: test_mon_mds: check_response 'already used by filesystem' 22 22 2026-03-31T20:40:03.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string='already used by filesystem' 2026-03-31T20:40:03.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:40:03.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:40:03.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:40:03.981 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- 'already used by filesystem' /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:40:03.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1176: test_mon_mds: ceph fs new cephfs mds-ec-pool fs_data --force 2026-03-31T20:40:04.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1177: test_mon_mds: check_response erasure-code 22 22 2026-03-31T20:40:04.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:103: check_response: expected_string=erasure-code 2026-03-31T20:40:04.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:104: check_response: retcode=22 2026-03-31T20:40:04.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:105: check_response: expected_retcode=22 2026-03-31T20:40:04.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:106: check_response: '[' 22 -a 22 '!=' 22 ']' 2026-03-31T20:40:04.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:111: check_response: grep --quiet -- erasure-code /tmp/cephtool.sYl/test_invalid.NL9 2026-03-31T20:40:04.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1178: test_mon_mds: set -e 2026-03-31T20:40:04.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1180: test_mon_mds: ceph osd pool delete mds-ec-pool mds-ec-pool --yes-i-really-really-mean-it 2026-03-31T20:40:04.870 INFO:tasks.workunit.client.0.vm03.stderr:pool 'mds-ec-pool' does not exist 2026-03-31T20:40:04.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1183: test_mon_mds: ceph fs new cephfs fs_metadata fs_data --force 2026-03-31T20:40:05.811 INFO:tasks.ceph.mon.a.vm03.stderr:2026-03-31T20:40:05.808+0000 7f870ab89640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-31T20:40:06.827 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs' already exists 2026-03-31T20:40:06.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1186: test_mon_mds: ceph osd tier add fs_metadata mds-tier 2026-03-31T20:40:07.892 INFO:tasks.workunit.client.0.vm03.stderr:pool 'mds-tier' is now (or already was) a tier of 'fs_metadata' 2026-03-31T20:40:07.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1187: test_mon_mds: ceph osd tier cache-mode mds-tier writeback 2026-03-31T20:40:08.899 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'mds-tier' to writeback 2026-03-31T20:40:08.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1188: test_mon_mds: ceph osd tier set-overlay fs_metadata mds-tier 2026-03-31T20:40:09.904 INFO:tasks.workunit.client.0.vm03.stderr:overlay for 'fs_metadata' is now (or already was) 'mds-tier' 2026-03-31T20:40:09.915 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1192: test_mon_mds: ceph osd tier cache-mode mds-tier proxy 2026-03-31T20:40:10.911 INFO:tasks.workunit.client.0.vm03.stderr:set cache-mode for pool 'mds-tier' to proxy 2026-03-31T20:40:10.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1193: test_mon_mds: ceph osd tier remove-overlay fs_metadata 2026-03-31T20:40:11.919 INFO:tasks.workunit.client.0.vm03.stderr:there is now (or already was) no overlay for 'fs_metadata' 2026-03-31T20:40:11.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1194: test_mon_mds: ceph osd tier remove fs_metadata mds-tier 2026-03-31T20:40:12.927 INFO:tasks.workunit.client.0.vm03.stderr:pool 'mds-tier' is now (or already was) not a tier of 'fs_metadata' 2026-03-31T20:40:12.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1195: test_mon_mds: ceph osd pool delete mds-tier mds-tier --yes-i-really-really-mean-it 2026-03-31T20:40:13.927 INFO:tasks.workunit.client.0.vm03.stderr:pool 'mds-tier' does not exist 2026-03-31T20:40:13.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1198: test_mon_mds: fail_all_mds cephfs 2026-03-31T20:40:13.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:831: fail_all_mds: fs_name=cephfs 2026-03-31T20:40:13.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:832: fail_all_mds: ceph fs set cephfs cluster_down true 2026-03-31T20:40:15.888 INFO:tasks.workunit.client.0.vm03.stderr:cephfs marked not joinable; MDS cannot join as newly active. WARNING: cluster_down flag is deprecated and will be removed in a future version. Please use "joinable". 
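[note] The one configuration in which the erasure-coded pool is accepted as a data pool is with overwrites enabled — allow_ec_overwrites, which requires BlueStore-backed OSDs — after which the previously rejected fs new goes through. The closing teardown then flips the tier to proxy, drops the overlay, detaches it, and deletes the pool. From trace lines 1167-1195:

    ceph osd pool set mds-ec-pool allow_ec_overwrites true
    ceph fs new cephfs fs_metadata mds-ec-pool --force   # accepted now

    # later, on the fs_metadata tier pair:
    ceph osd tier cache-mode mds-tier proxy
    ceph osd tier remove-overlay fs_metadata
    ceph osd tier remove fs_metadata mds-tier
    ceph osd pool delete mds-tier mds-tier --yes-i-really-really-mean-it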
2026-03-31T20:40:15.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: get_mds_gids cephfs 2026-03-31T20:40:15.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:825: get_mds_gids: fs_name=cephfs 2026-03-31T20:40:15.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: ceph fs get cephfs --format=json 2026-03-31T20:40:15.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:826: get_mds_gids: python3 -c 'import json; import sys; print('\'' '\''.join([m['\''gid'\''].__str__() for m in json.load(sys.stdin)['\''mdsmap'\'']['\''info'\''].values()]))' 2026-03-31T20:40:16.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:833: fail_all_mds: mds_gids= 2026-03-31T20:40:16.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:837: fail_all_mds: check_mds_active cephfs 2026-03-31T20:40:16.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:804: check_mds_active: fs_name=cephfs 2026-03-31T20:40:16.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: ceph fs get cephfs 2026-03-31T20:40:16.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:805: check_mds_active: grep active 2026-03-31T20:40:16.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1199: test_mon_mds: ceph fs rm cephfs --yes-i-really-mean-it 2026-03-31T20:40:17.909 INFO:tasks.workunit.client.0.vm03.stderr:filesystem 'cephfs' does not exist 2026-03-31T20:40:17.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1203: test_mon_mds: ceph mds stat 2026-03-31T20:40:18.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1209: test_mon_mds: ceph osd pool delete fs_data fs_data --yes-i-really-really-mean-it 2026-03-31T20:40:18.973 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fs_data' does not exist 2026-03-31T20:40:18.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1210: test_mon_mds: ceph osd pool delete fs_metadata fs_metadata --yes-i-really-really-mean-it 2026-03-31T20:40:19.974 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fs_metadata' does not exist 2026-03-31T20:40:19.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:40:20.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mon_mds_metadata 2026-03-31T20:40:20.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1215: test_mon_mds_metadata: ceph tell 'mon.*' version 2026-03-31T20:40:20.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1215: test_mon_mds_metadata: grep -c version 2026-03-31T20:40:20.279 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1215: test_mon_mds_metadata: local nmons=3 2026-03-31T20:40:20.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1216: test_mon_mds_metadata: test 3 -gt 0 2026-03-31T20:40:20.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1218: test_mon_mds_metadata: ceph fs dump 2026-03-31T20:40:20.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1219: test_mon_mds_metadata: sed -nEe 's/^([0-9]+):.*'\''([a-z])'\'' mds\.([0-9]+)\..*/\1 \2 \3/p' 2026-03-31T20:40:20.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1220: test_mon_mds_metadata: read gid id rank 2026-03-31T20:40:20.501 INFO:tasks.workunit.client.0.vm03.stderr:dumped fsmap epoch 97 2026-03-31T20:40:20.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1229: test_mon_mds_metadata: expect_false ceph mds metadata UNKNOWN 2026-03-31T20:40:20.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:40:20.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph mds metadata UNKNOWN 2026-03-31T20:40:20.685 INFO:tasks.workunit.client.0.vm03.stdout:{} 2026-03-31T20:40:20.685 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: MDS named 'UNKNOWN' does not exist, or is not up 2026-03-31T20:40:20.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:40:20.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:40:20.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mds_tell_help_command 2026-03-31T20:40:20.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2828: test_mds_tell_help_command: local FS_NAME=cephfs 2026-03-31T20:40:20.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2829: test_mds_tell_help_command: mds_exists 2026-03-31T20:40:20.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:861: mds_exists: ceph auth ls 2026-03-31T20:40:20.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:861: mds_exists: grep '^mds' 2026-03-31T20:40:21.156 INFO:tasks.workunit.client.0.vm03.stdout:Skipping test, no MDS found 2026-03-31T20:40:21.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2830: test_mds_tell_help_command: echo 'Skipping test, no MDS found' 2026-03-31T20:40:21.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2831: test_mds_tell_help_command: return 2026-03-31T20:40:21.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:40:21.362 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mds_messenger_dump 2026-03-31T20:40:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2945: test_mds_messenger_dump: do_messenger_dump_basics_test mds.a 2026-03-31T20:40:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2905: do_messenger_dump_basics_test: local target=mds.a 2026-03-31T20:40:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2906: do_messenger_dump_basics_test: ceph tell mds.a messenger dump 2026-03-31T20:40:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2906: do_messenger_dump_basics_test: expect_true jq --exit-status '.messengers | length > 0' 2026-03-31T20:40:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messengers | length > 0' 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:21.436+0000 7f2983cc9000 -1 client.13054 resolve_mds: no MDS daemons found by name `a' 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:21.436+0000 7f2983cc9000 -1 client.13054 FSMap: e97 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr:btime 2026-03-31T20:40:17:904982+0000 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr:enable_multiple, ever_enabled_multiple: 1,1 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr:legacy client fscid: -1 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr:No filesystems configured 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:40:21.438 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: problem getting command descriptions from mds.a 2026-03-31T20:40:21.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:21.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: ceph tell mds.a messenger dump 2026-03-31T20:40:21.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: jq -r '.messengers[]' 2026-03-31T20:40:21.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:40:21.504 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:21.504+0000 7f3a9089f000 -1 client.13037 resolve_mds: no MDS daemons found by name `a' 2026-03-31T20:40:21.504 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:21.504+0000 7f3a9089f000 -1 client.13037 FSMap: e97 2026-03-31T20:40:21.504 INFO:tasks.workunit.client.0.vm03.stderr:btime 2026-03-31T20:40:17:904982+0000 2026-03-31T20:40:21.504 INFO:tasks.workunit.client.0.vm03.stderr:enable_multiple, ever_enabled_multiple: 1,1 2026-03-31T20:40:21.504 INFO:tasks.workunit.client.0.vm03.stderr:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-31T20:40:21.504 INFO:tasks.workunit.client.0.vm03.stderr:legacy client fscid: -1 2026-03-31T20:40:21.504 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:40:21.504 INFO:tasks.workunit.client.0.vm03.stderr:No filesystems configured 2026-03-31T20:40:21.504 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-31T20:40:21.504 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: problem getting command descriptions from mds.a 2026-03-31T20:40:21.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:40:21.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mgr_tell 2026-03-31T20:40:21.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2851: test_mgr_tell: ceph tell mgr version 2026-03-31T20:40:21.770 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-31T20:40:21.770 INFO:tasks.workunit.client.0.vm03.stdout: "version": "20.2.0-721-g5bb32787", 2026-03-31T20:40:21.770 INFO:tasks.workunit.client.0.vm03.stdout: "release": "tentacle", 2026-03-31T20:40:21.770 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable" 2026-03-31T20:40:21.770 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-31T20:40:21.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:40:21.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mgr_devices 2026-03-31T20:40:21.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2856: test_mgr_devices: ceph device ls 2026-03-31T20:40:22.173 INFO:tasks.workunit.client.0.vm03.stdout:DEVICE HOST:DEV DAEMONS WEAR LIFE EXPECTANCY 2026-03-31T20:40:22.173 INFO:tasks.workunit.client.0.vm03.stdout:DWNBRSTVMM03001 vm03:vdb osd.0 2026-03-31T20:40:22.173 INFO:tasks.workunit.client.0.vm03.stdout:DWNBRSTVMM03002 vm03:vdc osd.1 2026-03-31T20:40:22.173 INFO:tasks.workunit.client.0.vm03.stdout:DWNBRSTVMM03003 vm03:vdd osd.2 2026-03-31T20:40:22.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2857: test_mgr_devices: expect_false ceph device info doesnotexist 2026-03-31T20:40:22.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:40:22.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph device info doesnotexist 2026-03-31T20:40:22.325 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:40:22.324+0000 7f026e475640 -1 mgr.server reply reply (2) No such file or directory 
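`ceph tell mgr version` (test.sh:2851, traced above) returns a small JSON document; when a script needs a single field it is simpler to extract it with jq than to parse the pretty-printed text. For example, against the output shown above:

    # Pull just the version string out of the mgr's version report.
    ceph tell mgr version | jq -r '.version'
    # -> 20.2.0-721-g5bb32787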
device doesnotexist not found 2026-03-31T20:40:22.325 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: device doesnotexist not found 2026-03-31T20:40:22.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:40:22.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2858: test_mgr_devices: expect_false ceph device get-health-metrics doesnotexist 2026-03-31T20:40:22.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:35: expect_false: set -x 2026-03-31T20:40:22.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: ceph device get-health-metrics doesnotexist 2026-03-31T20:40:22.473 INFO:tasks.ceph.mgr.x.vm03.stderr:2026-03-31T20:40:22.472+0000 7f0266465640 -1 mgr.server reply reply (2) No such file or directory device doesnotexist not found 2026-03-31T20:40:22.473 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: device doesnotexist not found 2026-03-31T20:40:22.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:36: expect_false: return 0 2026-03-31T20:40:22.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x 2026-03-31T20:40:22.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3100: : test_mgr_messenger_dump 2026-03-31T20:40:22.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2941: test_mgr_messenger_dump: do_messenger_dump_basics_test mgr 2026-03-31T20:40:22.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2905: do_messenger_dump_basics_test: local target=mgr 2026-03-31T20:40:22.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2906: do_messenger_dump_basics_test: ceph tell mgr messenger dump 2026-03-31T20:40:22.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2906: do_messenger_dump_basics_test: expect_true jq --exit-status '.messengers | length > 0' 2026-03-31T20:40:22.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:22.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messengers | length > 0' 2026-03-31T20:40:22.752 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:22.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:22.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: ceph tell mgr messenger dump 2026-03-31T20:40:22.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: jq -r '.messengers[]' 2026-03-31T20:40:22.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: 
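Lines 2906-2908 of the workunit, traced above and below, outline do_messenger_dump_basics_test: first confirm the target reports at least one messenger, then iterate over the reported names and request a full per-messenger dump of each. Reassembled from the trace as a standalone sketch (the run above exercises mds.a and mgr as targets; the expect_true wrapping is dropped here for brevity):

    # Enumerate a daemon's messengers and fetch a full dump of each one,
    # following the loop traced at test.sh:2906-2908.
    target=mgr    # assumption: any tell-able daemon name works here
    ceph tell "$target" messenger dump |
        jq --exit-status '.messengers | length > 0' >/dev/null
    ceph tell "$target" messenger dump |
        jq -r '.messengers[]' |
        while read -r messenger; do
            dump=$(ceph tell "$target" messenger dump "$messenger" all)
            printf '%s\n' "$dump" | jq --exit-status 'has("messenger")' >/dev/null
        done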
do_messenger_dump_basics_test: read messenger 2026-03-31T20:40:22.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell mgr messenger dump mgr all 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "name": "mgr", 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 7908166830183625570, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "num": 5953 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 832516962 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [], 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 4, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 7908166830183625570, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.899 
INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 47, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 8, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:22.899 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 21, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 10, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 
2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:22.900 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 7908166830183625570, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 7, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 
2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:22.901 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: } 
2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 7908166830183625570, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 28, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 6, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "addr": 
"192.168.123.103:6790", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.931999s", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.930915s", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 4, 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.902 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 246, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 
2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 7908166830183625570, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.903 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:22.904 
INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [], 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [], 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [ 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 7908166830183625570, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: }, 
2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:22.904 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:22.905 
INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 7908166830183625570, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.905 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 7, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: 
"nonce": 0 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:22.906 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [ 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 7908166830183625570, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: 
"status": { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 832516962 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "15m", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "22m", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "v1": { 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: 
"peer_global_seq": 0, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:22.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")' 2026-03-31T20:40:22.908 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:22.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:22.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:40:22.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:22.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:40:22.917 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:22.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:22.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger mgr --exit-status ' 2026-03-31T20:40:22.918 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:40:22.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:22.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger mgr --exit-status ' 2026-03-31T20:40:22.918 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:40:22.927 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:40:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 
2026-03-31T20:40:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:40:22.935 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:40:22.936 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:40:22.945 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:22.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:22.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:40:22.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell mgr messenger dump mgr-2764575205649382731 all 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "name": "mgr-2764575205649382731", 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2764575205649382731, 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "num": 5953 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 
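The final structural assertion (test.sh:2914, traced above) uses jq's all/2 over a hand-built list: it gathers the five connection-related fields of .messenger into one array, streams the elements out with [...][], and requires every one to be an array even when empty. Standalone, the same check reads (a sketch; dump.json is a hypothetical saved copy of a `messenger dump <name> all` document like the one printed above):

    # Every connection bucket in the messenger dump must be an array,
    # even when empty -- the shape asserted at test.sh:2914.
    jq --exit-status '.messenger
        | all([.connections, .listen_sockets, .anon_conns,
               .accepting_conns, .deleted_conns][];
              type == "array")' < dump.json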
2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [ 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 23, 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 3, 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3099256197 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.021 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 44, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 8, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "type": 1, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "id": 0, 2026-03-31T20:40:23.022 
INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3099256197 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "3s", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "3s", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 20, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3099256197 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 7, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 
2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.022 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1623152675 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 49, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 12, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "type": 1, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "id": 1, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1623152675 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "3s", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "3s", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 22, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.023 
INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1623152675 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 6, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.023 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3915604555 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.024 
INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 43, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 7, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "type": 1, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "id": 2, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3915604555 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "3s", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "3s", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 20, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3915604555 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.024 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 7, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 
2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [ 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 53, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 17, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "type": 16, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "id": "x" 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "id": 6702, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 6702, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1428989468 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.025 
INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 26, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1428989468 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.025 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 36, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 2, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.026 
INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "type": 16, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "id": "x" 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "id": 6663, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 6663, 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.026 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 14, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 7, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 
2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 37, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3, 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.027 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "id": "0" 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "id": 0, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4164, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "3s", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "3s", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 18, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: }, 
2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 46, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 39, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.028 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "id": "1" 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "id": 1, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4145, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 
2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "3s", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "3s", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 15, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 25, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: { 
2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 40, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.029 INFO:tasks.workunit.client.0.vm03.stderr: "type": 16, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "id": "x" 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "id": 5999, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 5999, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3667604046 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 19, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3667604046 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.030 
INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 42, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 6, 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.030 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "type": 16, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "id": "x" 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "id": 6011, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 6011, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2236585882 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.031 
INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 16, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2236585882 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.031 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.032 
INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 46, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 10, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "type": 4, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "id": "2" 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "id": 2, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 4144, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6808", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "3s", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "3s", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 21, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6808", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.032 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 20, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 
2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 48, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 11, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "type": 16, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "id": "x" 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "id": 5953, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 5953, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 832516962 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "3s", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "3s", 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 17, 2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 
2026-03-31T20:40:23.033 INFO:tasks.workunit.client.0.vm03.stderr:
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 832516962
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 7,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CLOSED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": false,
        "loopback": false
      },
      "socket_fd": null,
      "tcp_info": null,
      "conn_id": 1438,
      "peer": {
        "entity_name": {
          "type": 8,
          "id": "admin"
        },
        "type": "client",
        "id": 13076,
        "global_id": 13076,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 1159524449
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "0.34s",
      "recv_start_time_ago": "0.343406s",
      "last_tick_id": 0,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 1159524449
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "CLOSED",
          "con_mode": "unknown",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 4,
          "crypto": {
            "rx": "PLAIN",
            "tx": "PLAIN"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CLOSED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": false,
        "loopback": false
      },
      "socket_fd": null,
      "tcp_info": null,
      "conn_id": 1440,
      "peer": {
        "entity_name": {
          "type": 8,
          "id": "admin"
        },
        "type": "client",
        "id": 13082,
        "global_id": 13082,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 562932
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "0.192s",
      "recv_start_time_ago": "0.193973s",
      "last_tick_id": 0,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 562932
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "CLOSED",
          "con_mode": "unknown",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 4,
          "crypto": {
            "rx": "PLAIN",
            "tx": "PLAIN"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CONNECTION_ESTABLISHED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": true,
        "loopback": false
      },
      "socket_fd": 45,
      "tcp_info": null,
      "conn_id": 1442,
      "peer": {
        "entity_name": {
          "type": 8,
          "id": "admin"
        },
        "type": "client",
        "id": 13091,
        "global_id": 13091,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 1425182310
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "0s",
      "recv_start_time_ago": "0.00018706s",
      "last_tick_id": 1439,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 1425182310
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 4,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CLOSED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": false,
        "loopback": false
      },
      "socket_fd": null,
      "tcp_info": null,
      "conn_id": 1441,
      "peer": {
        "entity_name": {
          "type": 8,
          "id": "admin"
        },
        "type": "client",
        "id": 16584,
        "global_id": 16584,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 3513291679
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "0.12s",
      "recv_start_time_ago": "0.120622s",
      "last_tick_id": 0,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 3513291679
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "CLOSED",
          "con_mode": "unknown",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 4,
          "crypto": {
            "rx": "PLAIN",
            "tx": "PLAIN"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CLOSED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": false,
        "loopback": false
      },
      "socket_fd": null,
      "tcp_info": null,
      "conn_id": 1439,
      "peer": {
        "entity_name": {
          "type": 8,
          "id": "admin"
        },
        "type": "client",
        "id": 16566,
        "global_id": 16566,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 3548848795
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "0.264s",
      "recv_start_time_ago": "0.266845s",
      "last_tick_id": 0,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 3548848795
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "CLOSED",
          "con_mode": "unknown",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 4,
          "crypto": {
            "rx": "PLAIN",
            "tx": "PLAIN"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    }
  ],
  "accepting_conns": [
    {
      "state": "STATE_CONNECTION_ESTABLISHED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": true,
        "loopback": false
      },
      "socket_fd": 53,
      "tcp_info": null,
      "conn_id": 17,
      "peer": {
        "entity_name": {
          "type": 16,
          "id": "x"
        },
        "type": "client",
        "id": 6702,
        "global_id": 6702,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 1428989468
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "14m",
      "recv_start_time_ago": "14m",
      "last_tick_id": 26,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 1428989468
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 4,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CONNECTION_ESTABLISHED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": true,
        "loopback": false
      },
      "socket_fd": 36,
      "tcp_info": null,
      "conn_id": 2,
      "peer": {
        "entity_name": {
          "type": 16,
          "id": "x"
        },
        "type": "client",
        "id": 6663,
        "global_id": 6663,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 2742811205
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "14m",
      "recv_start_time_ago": "14m",
      "last_tick_id": 14,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 2742811205
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 7,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CONNECTION_ESTABLISHED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": true,
        "loopback": false
      },
      "socket_fd": 37,
      "tcp_info": null,
      "conn_id": 3,
      "peer": {
        "entity_name": {
          "type": 4,
          "id": "0"
        },
        "type": "osd",
        "id": 0,
        "global_id": 4164,
        "addr": {
          "addrvec": [
            {
              "type": "v2",
              "addr": "192.168.123.103:6804",
              "nonce": 950776786
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "3s",
      "recv_start_time_ago": "3s",
      "last_tick_id": 18,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6804",
        "nonce": 950776786
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 46,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 2
    },
    {
      "state": "STATE_CONNECTION_ESTABLISHED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": true,
        "loopback": false
      },
      "socket_fd": 39,
      "tcp_info": null,
      "conn_id": 4,
      "peer": {
        "entity_name": {
          "type": 4,
          "id": "1"
        },
        "type": "osd",
        "id": 1,
        "global_id": 4145,
        "addr": {
          "addrvec": [
            {
              "type": "v2",
              "addr": "192.168.123.103:6800",
              "nonce": 3578452477
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "3s",
      "recv_start_time_ago": "3s",
      "last_tick_id": 15,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6800",
        "nonce": 3578452477
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 25,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CONNECTION_ESTABLISHED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": true,
        "loopback": false
      },
      "socket_fd": 40,
      "tcp_info": null,
      "conn_id": 5,
      "peer": {
        "entity_name": {
          "type": 16,
          "id": "x"
        },
        "type": "client",
        "id": 5999,
        "global_id": 5999,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 3667604046
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "14m",
      "recv_start_time_ago": "14m",
      "last_tick_id": 19,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 3667604046
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 4,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 2
    },
    {
      "state": "STATE_CONNECTION_ESTABLISHED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": true,
        "loopback": false
      },
      "socket_fd": 42,
      "tcp_info": null,
      "conn_id": 6,
      "peer": {
        "entity_name": {
          "type": 16,
          "id": "x"
        },
        "type": "client",
        "id": 6011,
        "global_id": 6011,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 2236585882
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "14m",
      "recv_start_time_ago": "14m",
      "last_tick_id": 16,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 2236585882
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 4,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CONNECTION_ESTABLISHED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": true,
        "loopback": false
      },
      "socket_fd": 46,
      "tcp_info": null,
      "conn_id": 10,
      "peer": {
        "entity_name": {
          "type": 4,
          "id": "2"
        },
        "type": "osd",
        "id": 2,
        "global_id": 4144,
        "addr": {
          "addrvec": [
            {
              "type": "v2",
              "addr": "192.168.123.103:6808",
              "nonce": 1523795840
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "3s",
      "recv_start_time_ago": "3s",
      "last_tick_id": 21,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6808",
        "nonce": 1523795840
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 20,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 2
    },
    {
      "state": "STATE_CONNECTION_ESTABLISHED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": true,
        "loopback": false
      },
      "socket_fd": 48,
      "tcp_info": null,
      "conn_id": 11,
      "peer": {
        "entity_name": {
          "type": 16,
          "id": "x"
        },
        "type": "mgr",
        "id": 5953,
        "global_id": 5953,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 832516962
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "3s",
      "recv_start_time_ago": "3s",
      "last_tick_id": 17,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
      "target_addr": {
        "type": "any",
        "addr": "192.168.123.103:0",
        "nonce": 832516962
      },
      "port": -1,
      "protocol": {
        "v2": {
          "state": "READY",
          "con_mode": "secure",
          "rev1": true,
          "connect_seq": 0,
          "peer_global_seq": 7,
          "crypto": {
            "rx": "AES-128-GCM",
            "tx": "AES-128-GCM"
          },
          "compression": {
            "rx": "UNCOMPRESSED",
            "tx": "UNCOMPRESSED"
          }
        }
      },
      "worker_id": 1
    },
    {
      "state": "STATE_CLOSED",
      "messenger_nonce": 2764575205649382731,
      "status": {
        "connected": false,
        "loopback": false
      },
      "socket_fd": null,
      "tcp_info": null,
      "conn_id": 1438,
      "peer": {
        "entity_name": {
          "type": 8,
          "id": "admin"
        },
        "type": "client",
        "id": 13076,
        "global_id": 13076,
        "addr": {
          "addrvec": [
            {
              "type": "any",
              "addr": "192.168.123.103:0",
              "nonce": 1159524449
            }
          ]
        }
      },
      "last_connect_started_ago": "22m",
      "last_active_ago": "0.34s",
      "recv_start_time_ago": "0.34352s",
      "last_tick_id": 0,
      "socket_addr": {
        "type": "v2",
        "addr": "192.168.123.103:6812",
        "nonce": 2408938827
      },
2026-03-31T20:40:23.047 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.047 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.047 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.047 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1159524449 2026-03-31T20:40:23.047 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1440, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "type": 8, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "id": "admin" 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "id": 13082, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 13082, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 
2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 562932 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.192s", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.194086s", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 562932 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.048 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: { 
2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 45, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1442, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "type": 8, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "id": "admin" 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "id": 13091, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 13091, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1425182310 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0s", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.000300431s", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 1439, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1425182310 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 
2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.049 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1441, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "type": 8, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "id": "admin" 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "id": 16584, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 16584, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3513291679 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.050 
INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.12s", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.120735s", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3513291679 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 4, 2026-03-31T20:40:23.050 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 2764575205649382731, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: 
"socket_fd": null, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1439, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "type": 8, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "id": "admin" 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "id": 16566, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 16566, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3548848795 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "0.264s", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "0.266957s", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.051 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr: /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: 
expect_true jq --arg expected_messenger mgr-2764575205649382731 --exit-status ' 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger mgr-2764575205649382731 --exit-status ' 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:40:23.061 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:40:23.070 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.071 
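
The trace above is the tail of do_messenger_dump_basics_test (qa/workunits/cephtool/test.sh, around lines 2907-2914): it captures a messenger dump and asserts its basic shape with jq. Below is a minimal standalone sketch of the same checks, assuming only a reachable cluster with a usable client keyring and jq on PATH; the messenger name is copied from this run and will differ on any other.

    #!/usr/bin/env bash
    # Sketch only: re-runs the shape checks traced above against a live mgr.
    # Assumptions: ceph CLI with a working keyring, jq installed; the
    # messenger name below is taken from this particular run.
    set -ex
    messenger=mgr-2764575205649382731
    dump=$(ceph tell mgr messenger dump "$messenger" all)
    # The top level must carry the requested messenger's name...
    echo "$dump" | jq --exit-status 'has("name")'
    echo "$dump" | jq --arg expected_messenger "$messenger" \
        --exit-status '.name == $expected_messenger'
    # ...and a "messenger" object whose connection buckets are all arrays.
    echo "$dump" | jq --exit-status '.messenger | type == "object"'
    echo "$dump" | jq --exit-status '.messenger |
        all([.connections,
             .listen_sockets,
             .anon_conns,
             .accepting_conns,
             .deleted_conns][];
            type == "array")'

jq's --exit-status flag makes the process exit non-zero when the filter yields false or null, so each pipeline doubles as an assertion; expect_true in test.sh passes or fails on exactly that return code.
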
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:40:23.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell mgr messenger dump radosclient-11644255491061396046 all 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: "name": "radosclient-11644255491061396046", 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 11644255491061396046, 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: "num": 5999 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.144 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3667604046 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [], 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 5, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11644255491061396046, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.145 
INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 50, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 6, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 23, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:40:23.145 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 
2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 26, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11644255491061396046, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 38, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.146 
INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.146 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 17, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 3, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.147 
INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11644255491061396046, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.147 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: 
"last_active_ago": "14m", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:23.148 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 
2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11644255491061396046, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 29, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "6s", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "6s", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 8, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: 
}, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 240, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.149 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11644255491061396046, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.150 
INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 2, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.150 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.151 
INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [], 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [], 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [ 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11644255491061396046, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 2, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 
2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.151 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 
2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11644255491061396046, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.152 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 
2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [ 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 11644255491061396046, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.153 
INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.153 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3667604046 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "22m", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "v1": { 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 
'has("messenger")' 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:40:23.162 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger radosclient-11644255491061396046 --exit-status ' 2026-03-31T20:40:23.162 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:40:23.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger radosclient-11644255491061396046 --exit-status ' 2026-03-31T20:40:23.162 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:40:23.171 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:40:23.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:40:23.180 
INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:40:23.180 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:40:23.189 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:40:23.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell mgr messenger dump radosclient-12068228070141371932 all 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "name": "radosclient-12068228070141371932", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 12068228070141371932, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "num": 6702 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1428989468 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [], 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 5, 2026-03-31T20:40:23.263 
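The trace above exercises the messenger dump schema with four jq assertions before moving on to the next messenger. A minimal standalone sketch of the same checks follows, assuming only a reachable cluster and a messenger name such as the radosclient-<nonce> values reported in this log; MESSENGER is a placeholder argument, not part of the test itself:

    #!/usr/bin/env bash
    # Sketch only: replays the jq assertions traced above against one
    # "messenger dump" blob. MESSENGER is an assumption -- substitute a
    # name your own cluster reports (e.g. radosclient-<nonce>).
    set -ex
    MESSENGER=${1:?usage: $0 <messenger-name>}
    dump=$(ceph tell mgr messenger dump "$MESSENGER" all)
    echo "$dump" | jq --exit-status 'has("messenger")'
    echo "$dump" | jq --exit-status 'has("name")'
    echo "$dump" | jq --arg expected_messenger "$MESSENGER" --exit-status '
        .name == $expected_messenger'
    echo "$dump" | jq --exit-status '.messenger | type == "object"'
    echo "$dump" | jq --exit-status '.messenger |
        all([.connections,
             .listen_sockets,
             .anon_conns,
             .accepting_conns,
             .deleted_conns][];
            type == "array")'

Each jq --exit-status invocation exits non-zero when its predicate does not hold, which is what the test's expect_true wrapper keys off, as seen in the `expect_true: return 0` lines above.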
INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 12068228070141371932, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 54, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 6, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:40:23.263 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 28, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized 
address family 0)", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 27, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 12068228070141371932, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 
51, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.264 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 27, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 16, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.265 
INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.265 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 12068228070141371932, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.266 
INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.266 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.267 
INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 12068228070141371932, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 52, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 
2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "4s", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "4s", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 26, 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.267 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 577, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.268 
INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 12068228070141371932, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 2, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.268 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: 
"recv_start_time_ago": "14m", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [], 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [], 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [ 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 12068228070141371932, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.269 
INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 2, 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.269 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.270 
INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.270 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 12068228070141371932, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: 
"nonce": 0 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.271 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [ 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 12068228070141371932, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: 
"status": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1428989468 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "22m", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.272 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: "v1": { 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.273 
INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")' 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:40:23.281 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger radosclient-12068228070141371932 --exit-status ' 2026-03-31T20:40:23.281 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:40:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger radosclient-12068228070141371932 --exit-status ' 2026-03-31T20:40:23.281 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger' 2026-03-31T20:40:23.290 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"' 2026-03-31T20:40:23.291 
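The five checks that do_messenger_dump_basics_test steps through here can be replayed offline against a saved copy of the dump. A minimal sketch, assuming the JSON above were written to messenger.json (a hypothetical filename; the workunit itself pipes the dump straight into jq) -- the jq filters are verbatim from the trace:

    # Replay of the workunit's assertions (test.sh:2909-2914) against a saved dump.
    # messenger.json is an assumption for this sketch; each jq --exit-status call
    # exits 0 on success, matching the expect_true/return 0 pairs in the trace.
    expected_messenger=radosclient-12068228070141371932

    jq --exit-status 'has("messenger")' messenger.json
    jq --exit-status 'has("name")' messenger.json
    jq --arg expected_messenger "$expected_messenger" --exit-status \
        '.name == $expected_messenger' messenger.json
    jq --exit-status '.messenger | type == "object"' messenger.json
    jq --exit-status '.messenger |
        all([.connections,
             .listen_sockets,
             .anon_conns,
             .accepting_conns,
             .deleted_conns][];
            type == "array")' messenger.json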
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"' 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections, 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets, 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns, 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns, 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][]; 2026-03-31T20:40:23.300 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")' 2026-03-31T20:40:23.309 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger 2026-03-31T20:40:23.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell mgr messenger dump radosclient-14734355047442782106 all 2026-03-31T20:40:23.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{ 2026-03-31T20:40:23.383 INFO:tasks.workunit.client.0.vm03.stderr: "name": "radosclient-14734355047442782106", 2026-03-31T20:40:23.383 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": { 2026-03-31T20:40:23.383 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 14734355047442782106, 2026-03-31T20:40:23.383 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": { 2026-03-31T20:40:23.383 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client", 2026-03-31T20:40:23.383 INFO:tasks.workunit.client.0.vm03.stderr: "num": 6011 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.384 
INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2236585882 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [], 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s" 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 4, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [ 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 14734355047442782106, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 41, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 5, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.384 
INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 18, 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.384 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 5, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.385 
INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 14734355047442782106, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.385 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.386 
INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.386 
INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 14734355047442782106, 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.386 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 
2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.387 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 14734355047442782106, 2026-03-31T20:40:23.388 
INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 31, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 2, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [ 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "6s", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "6s", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 15, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302", 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.388 
INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.388 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 242, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM" 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: ], 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [], 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [], 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [ 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 14734355047442782106, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "status": { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "peer": { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "id": "" 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "addr": { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: 
"addrvec": [ 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": { 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.389 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2", 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300", 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "v2": { 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED", 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown", 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true, 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": { 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN", 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN" 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "compression": { 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED", 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED" 2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.390 
INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED",
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 14734355047442782106,
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false,
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4,
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon",
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301",
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790",
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m",
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m",
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m",
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.390 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 14734355047442782106,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2236585882
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "22m",
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:40:23.391 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "v1": {
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE",
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0,
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown"
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr:}'
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")'
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")'
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")'
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:40:23.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")'
2026-03-31T20:40:23.401 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:40:23.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:40:23.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger radosclient-14734355047442782106 --exit-status '
2026-03-31T20:40:23.401 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:40:23.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:40:23.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger radosclient-14734355047442782106 --exit-status '
2026-03-31T20:40:23.401 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:40:23.410 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:40:23.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:40:23.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"'
2026-03-31T20:40:23.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:40:23.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"'
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger |
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger |
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:40:23.419 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:40:23.428 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:40:23.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:40:23.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger
2026-03-31T20:40:23.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: ceph tell mgr messenger dump radosclient-3940796762142014021 all
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2908: do_messenger_dump_basics_test: dump='{
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "name": "radosclient-3940796762142014021",
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "messenger": {
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3940796762142014021,
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "my_name": {
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client",
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "num": 6663
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "my_addrs": {
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "listen_sockets": [],
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "dispatch_queue": {
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "length": 0,
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "max_age_ago": "0s"
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.502 INFO:tasks.workunit.client.0.vm03.stderr: "connections_count": 7,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "connections": [
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 32,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 9,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "12m",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "59s",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "59s",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 34,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6804",
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 950776786
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.503 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 47,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 35,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 8,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mgr",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 14,
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.504 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6812",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2408938827
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 2,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 34,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 7,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "3m",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "3m",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 17,
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.505 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6800",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 3578452477
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 24,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6808",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 33,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 6,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "type": "osd",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6808",
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.506 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "59s",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "59s",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 13,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6808",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 1523795840
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 19,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon",
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.507 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0,
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED",
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false,
2026-03-31T20:40:23.508 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0,
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.509 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "async_connection": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CONNECTION_ESTABLISHED",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": 30,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 2,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6789",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "6s",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "6s",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 8,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3300",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "state": "READY",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "secure",
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 574,
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.510 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "AES-128-GCM",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "AES-128-GCM"
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 0
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "anon_conns": [],
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "accepting_conns": [],
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "deleted_conns": [
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 4,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6790",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3301",
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.511 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 2
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_CLOSED",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "connected": false,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": false
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 3,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "type": "mon",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v1",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:6791",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "14m",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "14m",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "type": "v2",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:3302",
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1,
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "v2": {
2026-03-31T20:40:23.512 INFO:tasks.workunit.client.0.vm03.stderr: "state": "CLOSED",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "rev1": true,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "crypto": {
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "PLAIN",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "PLAIN"
INFO:tasks.workunit.client.0.vm03.stderr: "compression": {
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "rx": "UNCOMPRESSED",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "tx": "UNCOMPRESSED"
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: ],
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "local_connection": [
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "state": "STATE_NONE",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "messenger_nonce": 3940796762142014021,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "status": {
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "connected": true,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "loopback": true
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "socket_fd": null,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "tcp_info": null,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "conn_id": 1,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "peer": {
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "entity_name": {
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "type": 0,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "id": ""
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "type": "client",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "id": -1,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "global_id": 0,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "addr": {
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "addrvec": [
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: {
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "type": "any",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "192.168.123.103:0",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 2742811205
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: ]
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: }
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: },
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "last_connect_started_ago": "22m",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "last_active_ago": "14m",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "recv_start_time_ago": "22m",
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "last_tick_id": 0,
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "socket_addr": {
2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none",
2026-03-31T20:40:23.513
INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "target_addr": { 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "type": "none", 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "addr": "(unrecognized address family 0)", 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "nonce": 0 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "port": -1, 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "protocol": { 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "v1": { 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "state": "NONE", 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "connect_seq": 0, 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "peer_global_seq": 0, 2026-03-31T20:40:23.513 INFO:tasks.workunit.client.0.vm03.stderr: "con_mode": "unknown" 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr: }, 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr: "worker_id": 1 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr: ] 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr:}' 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2909: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("messenger")' 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("messenger")' 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2910: do_messenger_dump_basics_test: expect_true jq --exit-status 'has("name")' 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x 2026-03-31T20:40:23.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status 'has("name")' 2026-03-31T20:40:23.520 INFO:tasks.workunit.client.0.vm03.stdout:true 2026-03-31T20:40:23.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0 2026-03-31T20:40:23.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2911: do_messenger_dump_basics_test: expect_true jq --arg expected_messenger radosclient-3940796762142014021 --exit-status ' 
2026-03-31T20:40:23.520 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:40:23.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:40:23.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --arg expected_messenger radosclient-3940796762142014021 --exit-status '
2026-03-31T20:40:23.520 INFO:tasks.workunit.client.0.vm03.stderr: .name == $expected_messenger'
2026-03-31T20:40:23.529 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:40:23.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:40:23.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2913: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger | type == "object"'
2026-03-31T20:40:23.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:40:23.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger | type == "object"'
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2914: do_messenger_dump_basics_test: expect_true jq --exit-status '.messenger |
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:41: expect_true: set -x
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: jq --exit-status '.messenger |
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: all([.connections,
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: .listen_sockets,
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: .anon_conns,
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: .accepting_conns,
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: .deleted_conns][];
2026-03-31T20:40:23.538 INFO:tasks.workunit.client.0.vm03.stderr: type == "array")'
2026-03-31T20:40:23.547 INFO:tasks.workunit.client.0.vm03.stdout:true
2026-03-31T20:40:23.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:42: expect_true: return 0
2026-03-31T20:40:23.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:2907: do_messenger_dump_basics_test: read messenger
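[note: do_messenger_dump_basics_test validates the dump's shape with jq's --exit-status mode, where the filter's boolean result becomes the exit code, so each check plugs straight into expect_true. A minimal standalone sketch of the same checks (the dump.json filename is hypothetical, not from this run):
    jq --exit-status 'has("messenger") and has("name")' dump.json
    jq --arg expected_messenger radosclient-3940796762142014021 --exit-status '.name == $expected_messenger' dump.json
    jq --exit-status '.messenger | all([.connections, .listen_sockets, .anon_conns, .accepting_conns, .deleted_conns][]; type == "array")' dump.json]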
2026-03-31T20:40:23.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3101: : set +x
2026-03-31T20:40:23.751 INFO:tasks.workunit.client.0.vm03.stdout:OK
2026-03-31T20:40:23.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:3109: : echo OK
2026-03-31T20:40:23.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test.sh:1: : rm -fr /tmp/cephtool.sYl
2026-03-31T20:40:23.753 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-31T20:40:23.753 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-31T20:40:23.759 INFO:tasks.workunit:Running workunit cephtool/test_daemon.sh...
2026-03-31T20:40:23.759 DEBUG:teuthology.orchestra.run.vm03:workunit test cephtool/test_daemon.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_daemon.sh
2026-03-31T20:40:23.809 INFO:tasks.workunit.client.0.vm03.stdout:note: assuming mon.a is on the current host
2026-03-31T20:40:23.809 INFO:tasks.workunit.client.0.vm03.stderr:+ echo note: assuming mon.a is on the current host
2026-03-31T20:40:23.809 INFO:tasks.workunit.client.0.vm03.stderr:+ CEPH='sudo ceph'
2026-03-31T20:40:23.809 INFO:tasks.workunit.client.0.vm03.stderr:+ sudo ceph daemon mon.a version
2026-03-31T20:40:23.809 INFO:tasks.workunit.client.0.vm03.stderr:+ grep version
2026-03-31T20:40:23.887 INFO:tasks.workunit.client.0.vm03.stdout: "version": "20.2.0-721-g5bb32787",
2026-03-31T20:40:23.888 INFO:tasks.workunit.client.0.vm03.stderr:++ sudo ceph daemon mon.a config get debug_ms
2026-03-31T20:40:23.888 INFO:tasks.workunit.client.0.vm03.stderr:++ grep debug_ms
2026-03-31T20:40:23.888 INFO:tasks.workunit.client.0.vm03.stderr:++ sed -e 's/.*: //' -e 's/["\}\\]//g'
2026-03-31T20:40:23.969 INFO:tasks.workunit.client.0.vm03.stderr:+ old_ms=1/1
2026-03-31T20:40:23.969 INFO:tasks.workunit.client.0.vm03.stderr:+ sudo ceph daemon mon.a config set debug_ms 13
2026-03-31T20:40:24.031 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-31T20:40:24.031 INFO:tasks.workunit.client.0.vm03.stdout: "success": ""
2026-03-31T20:40:24.031 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:40:24.040 INFO:tasks.workunit.client.0.vm03.stderr:++ sudo ceph daemon mon.a config get debug_ms
2026-03-31T20:40:24.040 INFO:tasks.workunit.client.0.vm03.stderr:++ grep debug_ms
2026-03-31T20:40:24.040 INFO:tasks.workunit.client.0.vm03.stderr:++ sed -e 's/.*: //' -e 's/["\}\\]//g'
2026-03-31T20:40:24.109 INFO:tasks.workunit.client.0.vm03.stderr:+ new_ms=13/13
2026-03-31T20:40:24.109 INFO:tasks.workunit.client.0.vm03.stderr:+ '[' 13/13 = 13/13 ']'
2026-03-31T20:40:24.109 INFO:tasks.workunit.client.0.vm03.stderr:+ sudo ceph daemon mon.a config set debug_ms 1/1
2026-03-31T20:40:24.170 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-31T20:40:24.171 INFO:tasks.workunit.client.0.vm03.stdout: "success": ""
2026-03-31T20:40:24.171 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-31T20:40:24.178 INFO:tasks.workunit.client.0.vm03.stderr:++ sudo ceph daemon mon.a config get debug_ms
2026-03-31T20:40:24.178 INFO:tasks.workunit.client.0.vm03.stderr:++ grep debug_ms
2026-03-31T20:40:24.178 INFO:tasks.workunit.client.0.vm03.stderr:++ sed -e 's/.*: //' -e 's/["\}\\]//g'
2026-03-31T20:40:24.248 INFO:tasks.workunit.client.0.vm03.stderr:+ new_ms=1/1
2026-03-31T20:40:24.248 INFO:tasks.workunit.client.0.vm03.stderr:+ '[' 1/1 = 1/1 ']'
2026-03-31T20:40:24.248 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false sudo ceph daemon mon.a bogus_command_blah foo
2026-03-31T20:40:24.248 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:40:24.248 INFO:tasks.workunit.client.0.vm03.stderr:+ sudo ceph daemon mon.a bogus_command_blah foo
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:no valid command found; 10 closest matches:
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:abort
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:add_bootstrap_peer_hint
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:add_bootstrap_peer_hintv
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:assert
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:compact
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:config diff
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:config diff get
2026-03-31T20:40:24.311 INFO:tasks.workunit.client.0.vm03.stderr:admin_socket: invalid command
2026-03-31T20:40:24.313 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:40:24.313 INFO:tasks.workunit.client.0.vm03.stderr:+ set +e
2026-03-31T20:40:24.313 INFO:tasks.workunit.client.0.vm03.stderr:++ sudo ceph -c /not/a/ceph.conf daemon mon.a help
2026-03-31T20:40:24.372 INFO:tasks.workunit.client.0.vm03.stderr:+ OUTPUT='Can'\''t get admin socket path: unable to get conf option admin_socket for mon.a: b'\''global_init: unable to open config file from search list /not/a/ceph.conf\n'\'''
2026-03-31T20:40:24.372 INFO:tasks.workunit.client.0.vm03.stderr:+ '[' 22 '!=' 22 ']'
2026-03-31T20:40:24.372 INFO:tasks.workunit.client.0.vm03.stderr:+ echo 'Can'\''t get admin socket path: unable to get conf option admin_socket for mon.a: b'\''global_init: unable to open config file from search list /not/a/ceph.conf\n'\'''
2026-03-31T20:40:24.372 INFO:tasks.workunit.client.0.vm03.stderr:+ grep -q '.*open.*/not/a/ceph.conf'
2026-03-31T20:40:24.373 INFO:tasks.workunit.client.0.vm03.stderr:+ set -e
2026-03-31T20:40:24.373 INFO:tasks.workunit.client.0.vm03.stderr:+ echo OK
2026-03-31T20:40:24.373 INFO:tasks.workunit.client.0.vm03.stdout:OK
2026-03-31T20:40:24.374 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-31T20:40:24.374 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-31T20:40:24.422 INFO:tasks.workunit:Running workunit cephtool/test_kvstore_tool.sh...
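[note: test_daemon.sh exercises the mon admin socket end to end: save debug_ms, set it to 13, verify the daemon echoes 13/13 back, restore it, then confirm that a bogus command and a nonexistent -c config file fail cleanly. The round-trip above reduces to roughly this, assuming a reachable mon.a admin socket (the sed expression is copied from the trace):
    old_ms=$(sudo ceph daemon mon.a config get debug_ms | grep debug_ms | sed -e 's/.*: //' -e 's/["\}\\]//g')
    sudo ceph daemon mon.a config set debug_ms 13
    new_ms=$(sudo ceph daemon mon.a config get debug_ms | grep debug_ms | sed -e 's/.*: //' -e 's/["\}\\]//g')
    [ "$new_ms" = 13/13 ]                                  # debug levels read back as level/gather
    sudo ceph daemon mon.a config set debug_ms "$old_ms"   # restore the saved value (1/1 in this run)]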
2026-03-31T20:40:24.422 DEBUG:teuthology.orchestra.run.vm03:workunit test cephtool/test_kvstore_tool.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh
2026-03-31T20:40:24.473 INFO:tasks.workunit.client.0.vm03.stderr:++ dirname /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:+ source /home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/../../standalone/ceph-helpers.sh
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ TIMEOUT=300
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ WAIT_FOR_CLEAN_TIMEOUT=90
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ MAX_TIMEOUT=15
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ PG_NUM=4
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ TMPDIR=/tmp
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ CEPH_BUILD_VIRTUALENV=/tmp
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ TESTDIR=/home/ubuntu/cephtest
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:+++ uname
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ '[' Linux = FreeBSD ']'
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ SED=sed
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:++ AWK=awk
2026-03-31T20:40:24.474 INFO:tasks.workunit.client.0.vm03.stderr:+++ stty -a
2026-03-31T20:40:24.475 INFO:tasks.workunit.client.0.vm03.stderr:+++ head -1
2026-03-31T20:40:24.475 INFO:tasks.workunit.client.0.vm03.stderr:+++ sed -e 's/.*columns \([0-9]*\).*/\1/'
2026-03-31T20:40:24.475 INFO:tasks.workunit.client.0.vm03.stderr:stty: 'standard input': Inappropriate ioctl for device
2026-03-31T20:40:24.475 INFO:tasks.workunit.client.0.vm03.stderr:++ termwidth=
2026-03-31T20:40:24.476 INFO:tasks.workunit.client.0.vm03.stderr:++ '[' -n '' -a '' '!=' 0 ']'
2026-03-31T20:40:24.476 INFO:tasks.workunit.client.0.vm03.stderr:++ DIFFCOLOPTS='-y '
2026-03-31T20:40:24.476 INFO:tasks.workunit.client.0.vm03.stderr:++ KERNCORE=kernel.core_pattern
2026-03-31T20:40:24.476 INFO:tasks.workunit.client.0.vm03.stderr:++ EXTRA_OPTS=
2026-03-31T20:40:24.477 INFO:tasks.workunit.client.0.vm03.stderr:++ test '' = TESTS
2026-03-31T20:40:24.477 INFO:tasks.workunit.client.0.vm03.stderr:+ set -e
2026-03-31T20:40:24.477 INFO:tasks.workunit.client.0.vm03.stderr:+ set -o functrace
2026-03-31T20:40:24.477 INFO:tasks.workunit.client.0.vm03.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: '
2026-03-31T20:40:24.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:10: : SUDO=sudo
2026-03-31T20:40:24.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:11: : export CEPH_DEV=1
2026-03-31T20:40:24.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:11: : CEPH_DEV=1
2026-03-31T20:40:24.478 INFO:tasks.workunit.client.0.vm03.stdout:note: test ceph_kvstore_tool with bluestore
2026-03-31T20:40:24.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:13: : echo note: test ceph_kvstore_tool with bluestore
2026-03-31T20:40:24.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:21: : mktemp -d ./cephtool.XXX
2026-03-31T20:40:24.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:21: : TEMP_DIR=./cephtool.bHN
2026-03-31T20:40:24.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:22: : trap 'rm -fr ./cephtool.bHN' 0
2026-03-31T20:40:24.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:24: : mktemp ./cephtool.bHN/test_invalid.XXX
2026-03-31T20:40:24.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:24: : TEMP_FILE=./cephtool.bHN/test_invalid.w66
2026-03-31T20:40:24.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:69: : test_ceph_kvstore_tool
2026-03-31T20:40:24.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:29: test_ceph_kvstore_tool: ceph-objectstore-tool --data-path ./cephtool.bHN --op mkfs --no-mon-config
2026-03-31T20:40:24.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:32: test_ceph_kvstore_tool: wc -l
2026-03-31T20:40:24.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:32: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN list
2026-03-31T20:40:25.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:32: test_ceph_kvstore_tool: origin_kv_nums=12
2026-03-31T20:40:25.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:35: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN list
2026-03-31T20:40:25.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:35: test_ceph_kvstore_tool: head -n 1
2026-03-31T20:40:25.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:35: test_ceph_kvstore_tool: awk '{print $1}'
2026-03-31T20:40:25.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:35: test_ceph_kvstore_tool: prefix=B
2026-03-31T20:40:25.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:36: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN exists B
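[note: this is the core ceph-kvstore-tool flow against a throwaway BlueStore: mkfs into a temp dir with ceph-objectstore-tool, then drive the embedded RocksDB through the bluestore-kv subcommand. Stripped of the xtrace framing, the commands above are:
    TEMP_DIR=$(mktemp -d ./cephtool.XXX)
    ceph-objectstore-tool --data-path "$TEMP_DIR" --op mkfs --no-mon-config
    origin_kv_nums=$(ceph-kvstore-tool bluestore-kv "$TEMP_DIR" list | wc -l)                 # 12 keys right after mkfs in this run
    prefix=$(ceph-kvstore-tool bluestore-kv "$TEMP_DIR" list | head -n 1 | awk '{print $1}')  # first key's prefix, "B" here
    ceph-kvstore-tool bluestore-kv "$TEMP_DIR" exists "$prefix"]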
2026-03-31T20:40:25.279 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.276+0000 7f1b95db8d40 1 bdev(0x55627a38f800 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:25.279 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.276+0000 7f1b95db8d40 1 bdev(0x55627a38f800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:25.279 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.276+0000 7f1b95db8d40 1 bdev(0x55627a38f800 ./cephtool.bHN/block) close
2026-03-31T20:40:25.311 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bdev(0x55627a38f800 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:25.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bdev(0x55627a38f800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:25.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluestore(./cephtool.bHN) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
2026-03-31T20:40:25.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bdev(0x55627a38fc00 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:25.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bdev(0x55627a38fc00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:25.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB
2026-03-31T20:40:25.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluefs mount
2026-03-31T20:40:25.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000
2026-03-31T20:40:25.313 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:25.313 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:25.313 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:25.313 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:25.313 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:25.313 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluefs mount shared_bdev_used = 0
2026-03-31T20:40:25.314 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.312+0000 7f1b95db8d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 freelist init
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 freelist _read_cfg
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 bluestore(./cephtool.bHN) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 bluefs umount
2026-03-31T20:40:25.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.316+0000 7f1b95db8d40 1 bdev(0x55627a38fc00 ./cephtool.bHN/block) close
[... bluefs remount and rocksdb reopen (20:40:25.348-25.352), identical to the mount sequence above except "bluefs mount shared_bdev_used = 27197440" ...]
2026-03-31T20:40:25.353 INFO:tasks.workunit.client.0.vm03.stdout:(B, ) exists
2026-03-31T20:40:25.353 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.352+0000 7f1b95db8d40 1 bluefs umount
2026-03-31T20:40:25.353 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.352+0000 7f1b95db8d40 1 bdev(0x55627a38fc00 ./cephtool.bHN/block) close
2026-03-31T20:40:25.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.388+0000 7f1b95db8d40 1 freelist shutdown
2026-03-31T20:40:25.388 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.388+0000 7f1b95db8d40 1 bdev(0x55627a38f800 ./cephtool.bHN/block) close
2026-03-31T20:40:25.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:37: test_ceph_kvstore_tool: expect_false ceph-kvstore-tool bluestore-kv ./cephtool.bHN exists Bnotexist
2026-03-31T20:40:25.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:17: expect_false: set -x
2026-03-31T20:40:25.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:18: expect_false: ceph-kvstore-tool bluestore-kv ./cephtool.bHN exists Bnotexist
[... bdev open / bluefs mount / rocksdb open log for "exists Bnotexist" begins (from 20:40:25.436, thread 7f98c6154d40, bdev 0x55ab8e907800/0x55ab8e907c00), near-identical to the sequence above ...]
[... remainder of the bdev open / bluefs mount / rocksdb open log for "exists Bnotexist" (20:40:25.476-25.544), near-identical to the first invocation ...]
2026-03-31T20:40:25.544 INFO:tasks.workunit.client.0.vm03.stdout:(Bnotexist, ) does not exist
2026-03-31T20:40:25.545 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.544+0000 7f98c6154d40 1 bluefs umount
2026-03-31T20:40:25.545 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.544+0000 7f98c6154d40 1 bdev(0x55ab8e907c00 ./cephtool.bHN/block) close
2026-03-31T20:40:25.584 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.584+0000 7f98c6154d40 1 freelist shutdown
2026-03-31T20:40:25.584 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.584+0000 7f98c6154d40 1 bdev(0x55ab8e907800 ./cephtool.bHN/block) close
2026-03-31T20:40:25.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:18: expect_false: return 0
2026-03-31T20:40:25.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:40: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN list-crc
[... bdev open / bluefs mount / rocksdb open log for "list-crc" begins (from 20:40:25.640, thread 7f1ca8d9ed40, bdev 0x55638a6b3800/0x55638a6b3c00), near-identical to the sequences above ...]
[... remainder of the "list-crc" open log (20:40:25.680-25.716), near-identical to the sequences above ...]
2026-03-31T20:40:25.720 INFO:tasks.workunit.client.0.vm03.stdout:B blocks 3495804298
2026-03-31T20:40:25.720 INFO:tasks.workunit.client.0.vm03.stdout:B blocks_per_key 2356384685
2026-03-31T20:40:25.720 INFO:tasks.workunit.client.0.vm03.stdout:B bytes_per_block 1745006751
2026-03-31T20:40:25.720 INFO:tasks.workunit.client.0.vm03.stdout:B size 2876749260
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.720+0000 7f1ca8d9ed40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stdout:S blobid_max 3417961130
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stdout:S freelist_type 1105442534
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stdout:S min_alloc_size 1458662255
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stdout:S min_compat_ondisk_format 2796427507
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stdout:S nid_max 1692495819
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stdout:S ondisk_format 3256356326
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stdout:S per_pool_omap 490065073
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stdout:b %00%00%00%00%00%00%00%00 925950599
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.720+0000 7f1ca8d9ed40 1 bluefs umount
2026-03-31T20:40:25.721 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.720+0000 7f1ca8d9ed40 1 bdev(0x55638a6b3c00 ./cephtool.bHN/block) close
2026-03-31T20:40:25.755 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.756+0000 7f1ca8d9ed40 1 freelist shutdown
2026-03-31T20:40:25.756 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.756+0000 7f1ca8d9ed40 1 bdev(0x55638a6b3800 ./cephtool.bHN/block) close
2026-03-31T20:40:25.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:41: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN list-crc B
[... bdev open / bluefs mount / rocksdb open log for "list-crc B" (from 20:40:25.804, thread 7f9898318d40, bdev 0x55633996d800/0x55633996dc00), near-identical to the sequences above; the excerpt is truncated inside this sequence ...]
block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:25.840 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:25.840 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluefs mount 2026-03-31T20:40:25.840 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:25.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluefs mount shared_bdev_used = 0 2026-03-31T20:40:25.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.840+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:25.846 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:25.846 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0 2026-03-31T20:40:25.846 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0 2026-03-31T20:40:25.846 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-31T20:40:25.846 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000 2026-03-31T20:40:25.847 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 freelist init 2026-03-31T20:40:25.847 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 freelist _read_cfg 2026-03-31T20:40:25.847 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _open_fm effective 
freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000 2026-03-31T20:40:25.847 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0 2026-03-31T20:40:25.847 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 bluefs umount 2026-03-31T20:40:25.847 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.844+0000 7f9898318d40 1 bdev(0x55633996dc00 ./cephtool.bHN/block) close 2026-03-31T20:40:25.891 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bdev(0x55633996dc00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bdev(0x55633996dc00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluefs mount 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluefs mount shared_bdev_used = 27197440 2026-03-31T20:40:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.892+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:25.896 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.896+0000 7f9898318d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:25.896 INFO:tasks.workunit.client.0.vm03.stdout:B blocks 3495804298 
2026-03-31T20:40:25.896 INFO:tasks.workunit.client.0.vm03.stdout:B blocks_per_key 2356384685 2026-03-31T20:40:25.896 INFO:tasks.workunit.client.0.vm03.stdout:B bytes_per_block 1745006751 2026-03-31T20:40:25.896 INFO:tasks.workunit.client.0.vm03.stdout:B size 2876749260 2026-03-31T20:40:25.896 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.896+0000 7f9898318d40 1 bluefs umount 2026-03-31T20:40:25.896 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.896+0000 7f9898318d40 1 bdev(0x55633996dc00 ./cephtool.bHN/block) close 2026-03-31T20:40:25.939 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.940+0000 7f9898318d40 1 freelist shutdown 2026-03-31T20:40:25.940 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:25.940+0000 7f9898318d40 1 bdev(0x55633996d800 ./cephtool.bHN/block) close 2026-03-31T20:40:25.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:44: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN list B 2026-03-31T20:40:26.003 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.000+0000 7f6236acad40 1 bdev(0x55bece1bd800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.003 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.000+0000 7f6236acad40 1 bdev(0x55bece1bd800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.003 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.000+0000 7f6236acad40 1 bdev(0x55bece1bd800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bdev(0x55bece1bd800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bdev(0x55bece1bd800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-31T20:40:26.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bdev(0x55bece1bdc00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bdev(0x55bece1bdc00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluefs mount 2026-03-31T20:40:26.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluefs 
mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluefs mount shared_bdev_used = 0 2026-03-31T20:40:26.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.024+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.030 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.030 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0 2026-03-31T20:40:26.030 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0 2026-03-31T20:40:26.030 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-31T20:40:26.030 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000 2026-03-31T20:40:26.031 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 freelist init 2026-03-31T20:40:26.031 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 freelist _read_cfg 2026-03-31T20:40:26.031 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000 2026-03-31T20:40:26.031 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0 2026-03-31T20:40:26.031 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 bluefs umount 2026-03-31T20:40:26.031 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.028+0000 7f6236acad40 1 bdev(0x55bece1bdc00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bdev(0x55bece1bdc00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.064 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bdev(0x55bece1bdc00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluefs mount 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluefs mount shared_bdev_used = 27197440 2026-03-31T20:40:26.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.064+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.068 INFO:tasks.workunit.client.0.vm03.stdout:B blocks 2026-03-31T20:40:26.068 INFO:tasks.workunit.client.0.vm03.stdout:B blocks_per_key 2026-03-31T20:40:26.068 INFO:tasks.workunit.client.0.vm03.stdout:B bytes_per_block 2026-03-31T20:40:26.068 INFO:tasks.workunit.client.0.vm03.stdout:B size 2026-03-31T20:40:26.068 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.068+0000 7f6236acad40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.068 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.068+0000 7f6236acad40 1 bluefs umount 2026-03-31T20:40:26.068 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.068+0000 7f6236acad40 1 bdev(0x55bece1bdc00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.115 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.116+0000 7f6236acad40 1 freelist shutdown 2026-03-31T20:40:26.116 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.116+0000 7f6236acad40 1 bdev(0x55bece1bd800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.158 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:47: test_ceph_kvstore_tool: echo helloworld 2026-03-31T20:40:26.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:48: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN set TESTPREFIX TESTKEY in ./cephtool.bHN/test_invalid.w66 2026-03-31T20:40:26.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.172+0000 7efcec172d40 1 bdev(0x5654d78cd800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.172+0000 7efcec172d40 1 bdev(0x5654d78cd800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.172+0000 7efcec172d40 1 bdev(0x5654d78cd800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.208 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bdev(0x5654d78cd800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.208 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bdev(0x5654d78cd800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.208 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-31T20:40:26.208 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bdev(0x5654d78cdc00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.208 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bdev(0x5654d78cdc00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.208 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.208 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluefs mount 2026-03-31T20:40:26.208 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluefs mount final locked 
allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluefs mount shared_bdev_used = 0 2026-03-31T20:40:26.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.208+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.214 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.214 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0 2026-03-31T20:40:26.214 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0 2026-03-31T20:40:26.214 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-31T20:40:26.214 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000 2026-03-31T20:40:26.214 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 freelist init 2026-03-31T20:40:26.214 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 freelist _read_cfg 2026-03-31T20:40:26.215 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000 2026-03-31T20:40:26.215 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0 2026-03-31T20:40:26.215 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 bluefs umount 2026-03-31T20:40:26.215 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.212+0000 7efcec172d40 1 bdev(0x5654d78cdc00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bdev(0x5654d78cdc00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bdev(0x5654d78cdc00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bluefs mount 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 
bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bluefs mount shared_bdev_used = 27197440 2026-03-31T20:40:26.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.260+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.272 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.272+0000 7efcec172d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.273 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.272+0000 7efcec172d40 1 bluefs umount 2026-03-31T20:40:26.273 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.272+0000 7efcec172d40 1 bdev(0x5654d78cdc00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.315 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.316+0000 7efcec172d40 1 freelist shutdown 2026-03-31T20:40:26.316 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.316+0000 7efcec172d40 1 bdev(0x5654d78cd800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:49: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN exists TESTPREFIX TESTKEY 2026-03-31T20:40:26.371 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.368+0000 7f8a0fc7ed40 1 bdev(0x55b1da963800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.371 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.368+0000 7f8a0fc7ed40 1 bdev(0x55b1da963800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.371 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.368+0000 7f8a0fc7ed40 1 bdev(0x55b1da963800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.404 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bdev(0x55b1da963800 ./cephtool.bHN/block) open path 
./cephtool.bHN/block 2026-03-31T20:40:26.404 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bdev(0x55b1da963800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.404 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-31T20:40:26.404 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bdev(0x55b1da963c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.404 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bdev(0x55b1da963c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.404 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.404 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluefs mount 2026-03-31T20:40:26.404 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.405 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.405 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.405 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.405 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.405 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.405 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluefs mount shared_bdev_used = 0 2026-03-31T20:40:26.405 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.404+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.410 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.410 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0 2026-03-31T20:40:26.410 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0 2026-03-31T20:40:26.410 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-31T20:40:26.410 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000 2026-03-31T20:40:26.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 freelist init 2026-03-31T20:40:26.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 freelist _read_cfg 2026-03-31T20:40:26.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000 2026-03-31T20:40:26.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0 2026-03-31T20:40:26.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 bluefs umount 2026-03-31T20:40:26.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.408+0000 7f8a0fc7ed40 1 bdev(0x55b1da963c00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bdev(0x55b1da963c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bdev(0x55b1da963c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluefs mount 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.444 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluefs mount shared_bdev_used = 27262976 2026-03-31T20:40:26.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.444+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.448 INFO:tasks.workunit.client.0.vm03.stdout:(TESTPREFIX, TESTKEY) exists 2026-03-31T20:40:26.448 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.448+0000 7f8a0fc7ed40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.448 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.448+0000 7f8a0fc7ed40 1 bluefs umount 2026-03-31T20:40:26.448 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.448+0000 7f8a0fc7ed40 1 bdev(0x55b1da963c00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.499 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.500+0000 7f8a0fc7ed40 1 freelist shutdown 2026-03-31T20:40:26.500 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.500+0000 7f8a0fc7ed40 1 bdev(0x55b1da963800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:52: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN get TESTPREFIX TESTKEY out ./cephtool.bHN/test_invalid.w66.bak 2026-03-31T20:40:26.559 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.556+0000 7fc962da9d40 1 bdev(0x5571fe847800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.560 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.560+0000 7fc962da9d40 1 bdev(0x5571fe847800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.560 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.560+0000 7fc962da9d40 1 bdev(0x5571fe847800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bdev(0x5571fe847800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bdev(0x5571fe847800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-31T20:40:26.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bdev(0x5571fe847c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bdev(0x5571fe847c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.592 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluefs mount 2026-03-31T20:40:26.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.593 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.593 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.593 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.593 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.593 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.593 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluefs mount shared_bdev_used = 0 2026-03-31T20:40:26.593 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.592+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.598 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.598 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0 2026-03-31T20:40:26.598 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0 2026-03-31T20:40:26.598 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-31T20:40:26.598 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000 2026-03-31T20:40:26.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 freelist init 2026-03-31T20:40:26.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 freelist _read_cfg 2026-03-31T20:40:26.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000 
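The records around this point show the fixed life cycle that every ceph-kvstore-tool bluestore-kv invocation goes through in this workunit: probe-open the block device, reopen it for BlueFS, mount BlueFS over the shared bdev, open RocksDB with the LZ4 options string logged above, run the single subcommand, then umount and close everything again. The set/exists/get sequence being driven here (script lines 48, 49 and 52) works against any offline BlueStore directory; a minimal sketch in the workunit's own shell, with the store path and payload mirroring this run and the file names purely illustrative (the store must not be in use by a running OSD):

  # set/exists/get round-trip as exercised by test_kvstore_tool.sh (sketch;
  # ./cephtool.bHN is the throwaway BlueStore dir this job created)
  echo helloworld > in.txt
  ceph-kvstore-tool bluestore-kv ./cephtool.bHN set TESTPREFIX TESTKEY in in.txt
  ceph-kvstore-tool bluestore-kv ./cephtool.bHN exists TESTPREFIX TESTKEY   # "(TESTPREFIX, TESTKEY) exists"
  ceph-kvstore-tool bluestore-kv ./cephtool.bHN get TESTPREFIX TESTKEY out out.bak
  diff in.txt out.bak   # round-trip must be byte-identical

Note that each command pays the full mount/umount cycle visible in the surrounding records, so even a single get produces dozens of log lines and roughly 200 ms of wall time here (20:40:26.546 to 20:40:26.734).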
2026-03-31T20:40:26.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0 2026-03-31T20:40:26.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 bluefs umount 2026-03-31T20:40:26.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.596+0000 7fc962da9d40 1 bdev(0x5571fe847c00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.639 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bdev(0x5571fe847c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bdev(0x5571fe847c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluefs mount 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluefs mount shared_bdev_used = 27262976 2026-03-31T20:40:26.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.640+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.644 INFO:tasks.workunit.client.0.vm03.stdout:(TESTPREFIX, TESTKEY) 2026-03-31T20:40:26.644 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.644+0000 7fc962da9d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.644 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.644+0000 7fc962da9d40 1 bluefs umount 2026-03-31T20:40:26.644 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.644+0000 7fc962da9d40 1 bdev(0x5571fe847c00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.687 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.688+0000 7fc962da9d40 1 freelist shutdown 2026-03-31T20:40:26.688 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.688+0000 7fc962da9d40 1 bdev(0x5571fe847800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:53: test_ceph_kvstore_tool: diff ./cephtool.bHN/test_invalid.w66 ./cephtool.bHN/test_invalid.w66.bak 2026-03-31T20:40:26.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:56: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN rm TESTPREFIX TESTKEY 2026-03-31T20:40:26.748 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.748+0000 7fe9d1b97d40 1 bdev(0x5601852c1800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.748 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.748+0000 7fe9d1b97d40 1 bdev(0x5601852c1800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.748 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.748+0000 7fe9d1b97d40 1 bdev(0x5601852c1800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.783 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bdev(0x5601852c1800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.784 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bdev(0x5601852c1800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.784 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-31T20:40:26.784 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bdev(0x5601852c1c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.784 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bdev(0x5601852c1c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.784 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.784 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluefs mount 2026-03-31T20:40:26.784 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.785 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.785 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 1 
<0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.785 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.785 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.785 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.785 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluefs mount shared_bdev_used = 0 2026-03-31T20:40:26.785 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.784+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.790 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.790 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0 2026-03-31T20:40:26.790 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0 2026-03-31T20:40:26.790 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-31T20:40:26.790 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000 2026-03-31T20:40:26.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 freelist init 2026-03-31T20:40:26.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 freelist _read_cfg 2026-03-31T20:40:26.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000 2026-03-31T20:40:26.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0 2026-03-31T20:40:26.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 bluefs umount 2026-03-31T20:40:26.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.788+0000 7fe9d1b97d40 1 bdev(0x5601852c1c00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bdev(0x5601852c1c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.824 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bdev(0x5601852c1c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluefs mount 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluefs mount shared_bdev_used = 27262976 2026-03-31T20:40:26.824 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.824+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952 2026-03-31T20:40:26.836 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.836+0000 7fe9d1b97d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-31T20:40:26.838 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.836+0000 7fe9d1b97d40 1 bluefs umount 2026-03-31T20:40:26.838 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.836+0000 7fe9d1b97d40 1 bdev(0x5601852c1c00 ./cephtool.bHN/block) close 2026-03-31T20:40:26.867 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.868+0000 7fe9d1b97d40 1 freelist shutdown 2026-03-31T20:40:26.868 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.868+0000 7fe9d1b97d40 1 bdev(0x5601852c1800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:57: test_ceph_kvstore_tool: expect_false ceph-kvstore-tool bluestore-kv ./cephtool.bHN exists TESTPREFIX TESTKEY 2026-03-31T20:40:26.906 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:17: expect_false: set -x 2026-03-31T20:40:26.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:18: expect_false: ceph-kvstore-tool bluestore-kv ./cephtool.bHN exists TESTPREFIX TESTKEY 2026-03-31T20:40:26.919 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.916+0000 7fb7b8772d40 1 bdev(0x558acbc8d800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.919 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.916+0000 7fb7b8772d40 1 bdev(0x558acbc8d800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.919 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.916+0000 7fb7b8772d40 1 bdev(0x558acbc8d800 ./cephtool.bHN/block) close 2026-03-31T20:40:26.951 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bdev(0x558acbc8d800 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.952 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bdev(0x558acbc8d800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.952 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-31T20:40:26.952 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bdev(0x558acbc8dc00 ./cephtool.bHN/block) open path ./cephtool.bHN/block 2026-03-31T20:40:26.952 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bdev(0x558acbc8dc00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-31T20:40:26.952 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB 2026-03-31T20:40:26.952 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluefs mount 2026-03-31T20:40:26.952 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000 2026-03-31T20:40:26.953 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.953 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.954 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.954 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-31T20:40:26.954 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 
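The sh:17/sh:18 frames above are the workunit's expect_false wrapper at work: after the rm at script line 56, exists must print "(TESTPREFIX, TESTKEY) does not exist" and exit non-zero, and expect_false inverts that exit status so the test only proceeds when the wrapped command fails. A minimal sketch consistent with this trace (the authoritative definition is the one inside test_kvstore_tool.sh):

  # expect_false: succeed only if the wrapped command fails (sketch matching
  # the sh:17 "set -x" and sh:18 "return 0" frames in this log)
  expect_false() {
      set -x
      if "$@"; then return 1; else return 0; fi
  }

  ceph-kvstore-tool bluestore-kv ./cephtool.bHN rm TESTPREFIX TESTKEY
  expect_false ceph-kvstore-tool bluestore-kv ./cephtool.bHN exists TESTPREFIX TESTKEY

The "expect_false: return 0" frame logged below, right after the does-not-exist message, is exactly this inversion taking the else branch.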
2026-03-31T20:40:26.954 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluefs mount shared_bdev_used = 0
2026-03-31T20:40:26.954 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.952+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952
2026-03-31T20:40:26.959 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.956+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
2026-03-31T20:40:26.959 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.956+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0
2026-03-31T20:40:26.959 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.956+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0
2026-03-31T20:40:26.959 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.956+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3
2026-03-31T20:40:26.959 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.956+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000
2026-03-31T20:40:26.960 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.960+0000 7fb7b8772d40 1 freelist init
2026-03-31T20:40:26.960 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.960+0000 7fb7b8772d40 1 freelist _read_cfg
2026-03-31T20:40:26.960 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.960+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000
2026-03-31T20:40:26.960 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.960+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0
2026-03-31T20:40:26.960 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.960+0000 7fb7b8772d40 1 bluefs umount
2026-03-31T20:40:26.960 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:26.960+0000 7fb7b8772d40 1 bdev(0x558acbc8dc00 ./cephtool.bHN/block) close
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bdev(0x558acbc8dc00 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bdev(0x558acbc8dc00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluefs mount
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluefs mount shared_bdev_used = 27328512
2026-03-31T20:40:27.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.004+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952
2026-03-31T20:40:27.009 INFO:tasks.workunit.client.0.vm03.stdout:(TESTPREFIX, TESTKEY) does not exist
2026-03-31T20:40:27.009 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.008+0000 7fb7b8772d40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
2026-03-31T20:40:27.009 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.008+0000 7fb7b8772d40 1 bluefs umount
2026-03-31T20:40:27.009 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.008+0000 7fb7b8772d40 1 bdev(0x558acbc8dc00 ./cephtool.bHN/block) close
2026-03-31T20:40:27.055 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.056+0000 7fb7b8772d40 1 freelist shutdown
2026-03-31T20:40:27.055 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.056+0000 7fb7b8772d40 1 bdev(0x558acbc8d800 ./cephtool.bHN/block) close
2026-03-31T20:40:27.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:18: expect_false: return 0
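[annotation] The `exists` probe above mounts the BlueStore directory, opens its RocksDB, and reports whether the (prefix, key) pair is present; here it prints "(TESTPREFIX, TESTKEY) does not exist" and exits non-zero, which is exactly what the test wants. A rough sketch of the pattern, assuming the usual qa-workunit shape of the expect_false helper and a hypothetical store path:
    # succeed only when the wrapped command fails (roughly how the qa helpers define it)
    expect_false() {
        set -x
        if "$@"; then return 1; else return 0; fi
    }
    # probe a (prefix, key) pair in an offline BlueStore kv store
    expect_false ceph-kvstore-tool bluestore-kv /path/to/osd-dir exists TESTPREFIX TESTKEY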
2026-03-31T20:40:27.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:60: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN compact
2026-03-31T20:40:27.107 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.104+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7800 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:27.107 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.104+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:27.107 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.104+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7800 ./cephtool.bHN/block) close
2026-03-31T20:40:27.147 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7800 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:27.148 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:27.148 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
2026-03-31T20:40:27.148 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:27.148 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:27.148 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB
2026-03-31T20:40:27.148 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluefs mount
2026-03-31T20:40:27.148 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000
2026-03-31T20:40:27.150 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.150 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.150 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.150 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.150 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.150 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluefs mount shared_bdev_used = 0
2026-03-31T20:40:27.150 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.148+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 freelist init
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 freelist _read_cfg
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.152+0000 7f4dce39ed40 1 bluefs umount
2026-03-31T20:40:27.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.156+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7c00 ./cephtool.bHN/block) close
2026-03-31T20:40:27.187 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluefs mount
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluefs mount shared_bdev_used = 27328512
2026-03-31T20:40:27.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.188+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952
2026-03-31T20:40:27.201 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.200+0000 7f4dce39ed40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
2026-03-31T20:40:27.201 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.200+0000 7f4dce39ed40 2 rocksdb: compact starting
2026-03-31T20:40:27.204 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.204+0000 7f4dce39ed40 2 rocksdb: compact completed
2026-03-31T20:40:27.205 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.204+0000 7f4dce39ed40 1 bluefs umount
2026-03-31T20:40:27.205 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.204+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7c00 ./cephtool.bHN/block) close
2026-03-31T20:40:27.243 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.244+0000 7f4dce39ed40 1 freelist shutdown
2026-03-31T20:40:27.243 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.244+0000 7f4dce39ed40 1 bdev(0x55d8fb2b7800 ./cephtool.bHN/block) close
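[annotation] The `compact` subcommand runs a full RocksDB compaction against the offline store; the whole operation is bracketed by the "rocksdb: compact starting" / "rocksdb: compact completed" pair above. Usage sketch (path illustrative; the store must not be in use by a running daemon):
    ceph-kvstore-tool bluestore-kv /path/to/osd-dir compact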
2026-03-31T20:40:27.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:63: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN destructive-repair
2026-03-31T20:40:27.295 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.292+0000 7f9c3310dd40 1 bdev(0x5582866d7800 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:27.295 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.292+0000 7f9c3310dd40 1 bdev(0x5582866d7800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:27.295 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.292+0000 7f9c3310dd40 1 bdev(0x5582866d7800 ./cephtool.bHN/block) close
2026-03-31T20:40:27.331 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bdev(0x5582866d7800 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:27.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bdev(0x5582866d7800 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:27.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
2026-03-31T20:40:27.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bdev(0x5582866d7c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:27.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bdev(0x5582866d7c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:27.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB
2026-03-31T20:40:27.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluefs mount
2026-03-31T20:40:27.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000
2026-03-31T20:40:27.334 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.334 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.334 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.334 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.334 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.334 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluefs mount shared_bdev_used = 0
2026-03-31T20:40:27.334 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.332+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _open_super_meta old nid_max 0
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _open_super_meta old blobid_max 0
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _open_super_meta ondisk_format 4 compat_ondisk_format 3
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _open_super_meta min_alloc_size 0x1000
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 freelist init
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 freelist _read_cfg
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _open_fm effective freelist_type = bitmap, freelist_alloc_size = 0x1000, min_alloc_size = 0x1000
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _init_alloc loaded 90 GiB in 1 extents, allocator type hybrid, capacity 0x1680000000, block size 0x1000, free 0x167fffe000, fragmentation 0
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 bluefs umount
2026-03-31T20:40:27.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.336+0000 7f9c3310dd40 1 bdev(0x5582866d7c00 ./cephtool.bHN/block) close
2026-03-31T20:40:27.371 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bdev(0x5582866d7c00 ./cephtool.bHN/block) open path ./cephtool.bHN/block
2026-03-31T20:40:27.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bdev(0x5582866d7c00 ./cephtool.bHN/block) open size 96636764160 (0x1680000000, 90 GiB) block_size 4096 (4 KiB) rotational device, discard supported
2026-03-31T20:40:27.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluefs add_block_device bdev 1 path ./cephtool.bHN/block size 90 GiB
2026-03-31T20:40:27.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluefs mount
2026-03-31T20:40:27.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluefs _init_alloc shared, id 1, capacity 0x1680000000, block size 0x10000
2026-03-31T20:40:27.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.373 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.373 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
2026-03-31T20:40:27.373 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluefs mount shared_bdev_used = 8781824
2026-03-31T20:40:27.373 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.372+0000 7f9c3310dd40 1 bluestore(./cephtool.bHN) _prepare_db_environment set db_paths to db,91804925952 db.slow,91804925952
2026-03-31T20:40:27.423 INFO:tasks.workunit.client.0.vm03.stdout:destructive-repair completed without reporting an error
2026-03-31T20:40:27.423 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.424+0000 7f9c3310dd40 1 bluefs umount
2026-03-31T20:40:27.423 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.424+0000 7f9c3310dd40 1 bdev(0x5582866d7c00 ./cephtool.bHN/block) close
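[annotation] `destructive-repair` rewrites on-disk kv state and is deliberately named to discourage casual use; the test only asserts that it exits cleanly on a healthy store, matching the "destructive-repair completed without reporting an error" line above. Sketch (illustrative path, stopped daemon; taking a backup first would be prudent):
    ceph-kvstore-tool bluestore-kv /path/to/osd-dir destructive-repair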
2026-03-31T20:40:27.459 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.460+0000 7f9c3310dd40 1 freelist shutdown
2026-03-31T20:40:27.459 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-31T20:40:27.460+0000 7f9c3310dd40 1 bdev(0x5582866d7800 ./cephtool.bHN/block) close
2026-03-31T20:40:27.499 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:65: test_ceph_kvstore_tool: ceph-kvstore-tool bluestore-kv ./cephtool.bHN list
2026-03-31T20:40:27.499 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:65: test_ceph_kvstore_tool: wc -l
2026-03-31T20:40:27.686 INFO:tasks.workunit.client.0.vm03.stdout:OK
2026-03-31T20:40:27.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:65: test_ceph_kvstore_tool: current_kv_nums=12
2026-03-31T20:40:27.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:66: test_ceph_kvstore_tool: test 12 -eq 12
2026-03-31T20:40:27.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:71: : echo OK
2026-03-31T20:40:27.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/workunits/cephtool/test_kvstore_tool.sh:1: : rm -fr ./cephtool.bHN
2026-03-31T20:40:27.688 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-31T20:40:27.688 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-31T20:40:27.739 INFO:tasks.workunit:Stopping ['cephtool'] on client.0...
2026-03-31T20:40:27.739 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
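[annotation] The count check above pipes `list` through `wc -l` and expects exactly 12 keys in this freshly created store. A condensed equivalent of script lines 65-66 (the stderr redirect is an addition here, to keep the bdev/bluefs chatter out of the count):
    current_kv_nums=$(ceph-kvstore-tool bluestore-kv ./cephtool.bHN list 2>/dev/null | wc -l)
    test "$current_kv_nums" -eq 12 && echo OK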
2026-03-31T20:40:28.118 DEBUG:teuthology.parallel:result is None
2026-03-31T20:40:28.118 DEBUG:teuthology.orchestra.run.vm03:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 0392f78529848ec72469e8e431875cb98d3a5fb4
2026-03-31T20:40:28.122 INFO:tasks.workunit.client.0.vm03.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'...
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:Note: switching to '0392f78529848ec72469e8e431875cb98d3a5fb4'.
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:You are in 'detached HEAD' state. You can look around, make experimental
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:changes and commit them, and you can discard any commits you make in this
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:state without impacting any branches by switching back to a branch.
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:If you want to create a new branch to retain commits you create, you may
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:do so (now or later) by using -c with the switch command. Example:
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:  git switch -c <new-branch-name>
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:Or undo this operation with:
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:  git switch -
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:Turn off this advice by setting config variable advice.detachedHead to false
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:
2026-03-31T20:41:07.528 INFO:tasks.workunit.client.0.vm03.stderr:HEAD is now at 0392f785298 qa/tasks/keystone: restart mariadb for rocky and alma linux too
2026-03-31T20:41:07.536 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0
2026-03-31T20:41:07.582 INFO:tasks.workunit.client.0.vm03.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done
2026-03-31T20:41:07.584 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-31T20:41:07.584 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test
2026-03-31T20:41:07.614 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io
2026-03-31T20:41:07.642 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read
2026-03-31T20:41:07.666 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io'
2026-03-31T20:41:07.667 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-31T20:41:07.667 INFO:tasks.workunit.client.0.vm03.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc
2026-03-31T20:41:07.689 INFO:tasks.workunit.client.0.vm03.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs'
2026-03-31T20:41:07.692 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-31T20:41:07.692 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout
2026-03-31T20:41:07.742 INFO:tasks.workunit:Running workunits matching mon/pool_ops.sh on client.0...
2026-03-31T20:41:07.743 INFO:tasks.workunit:Running workunit mon/pool_ops.sh...
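[annotation] Each workunit is executed inside the client mount with a prepared environment (CEPH_ARGS, CEPH_ID, TESTDIR, the coverage wrapper, and a 3 h timeout), as the DEBUG line below shows. A rough local equivalent for iterating on a single script, with hypothetical paths and without the teuthology wrappers:
    cd ~/ceph/qa/workunits
    CEPH_ARGS="--cluster ceph" CEPH_ID=0 timeout 3h ./mon/pool_ops.sh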
2026-03-31T20:41:07.743 DEBUG:teuthology.orchestra.run.vm03:workunit test mon/pool_ops.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=0392f78529848ec72469e8e431875cb98d3a5fb4 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/mon/pool_ops.sh
2026-03-31T20:41:07.789 INFO:tasks.workunit.client.0.vm03.stderr:+ TEST_POOL=testpool1234
2026-03-31T20:41:07.789 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool create testpool1234 8 --autoscale-mode off
2026-03-31T20:41:08.887 INFO:tasks.workunit.client.0.vm03.stderr:pool 'testpool1234' already exists
2026-03-31T20:41:08.899 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set testpool1234 pg_num_min 2
2026-03-31T20:41:10.846 INFO:tasks.workunit.client.0.vm03.stderr:set pool 52 pg_num_min to 2
2026-03-31T20:41:10.860 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool get testpool1234 pg_num_min
2026-03-31T20:41:10.860 INFO:tasks.workunit.client.0.vm03.stderr:+ grep 2
2026-03-31T20:41:11.061 INFO:tasks.workunit.client.0.vm03.stdout:pg_num_min: 2
2026-03-31T20:41:11.062 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set testpool1234 pg_num_max 33
2026-03-31T20:41:12.856 INFO:tasks.workunit.client.0.vm03.stderr:set pool 52 pg_num_max to 33
2026-03-31T20:41:12.875 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool get testpool1234 pg_num_max
2026-03-31T20:41:12.876 INFO:tasks.workunit.client.0.vm03.stderr:+ grep 33
2026-03-31T20:41:13.077 INFO:tasks.workunit.client.0.vm03.stdout:pg_num_max: 33
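[annotation] pg_num_min and pg_num_max bound the PG count that the autoscaler (or a manual resize) may choose, and a value that was set reads straight back via `get`. Condensed from the trace above:
    ceph osd pool set testpool1234 pg_num_min 2
    ceph osd pool get testpool1234 pg_num_min    # -> pg_num_min: 2
    ceph osd pool set testpool1234 pg_num_max 33
    ceph osd pool get testpool1234 pg_num_max    # -> pg_num_max: 33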
2026-03-31T20:41:13.077 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool set testpool1234 pg_num_min 9
2026-03-31T20:41:13.077 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:13.077 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set testpool1234 pg_num_min 9
2026-03-31T20:41:13.227 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: specified pg_num_min 9 > pg_num 8
2026-03-31T20:41:13.232 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:13.232 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool set testpool1234 pg_num_max 7
2026-03-31T20:41:13.232 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:13.232 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set testpool1234 pg_num_max 7
2026-03-31T20:41:13.380 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: specified pg_num_max 7 < pg_num 8
2026-03-31T20:41:13.383 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:13.383 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool set testpool1234 pg_num 1
2026-03-31T20:41:13.384 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:13.384 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set testpool1234 pg_num 1
2026-03-31T20:41:13.527 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: specified pg_num 1 < pg_num_min 2
2026-03-31T20:41:13.531 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:13.531 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool set testpool1234 pg_num 44
2026-03-31T20:41:13.531 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:13.531 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set testpool1234 pg_num 44
2026-03-31T20:41:13.674 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: specified pg_num 44 < pg_num_max 33
2026-03-31T20:41:13.678 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:13.678 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set testpool1234 pg_num_min 0
2026-03-31T20:41:14.879 INFO:tasks.workunit.client.0.vm03.stderr:set pool 52 pg_num_min to 0
2026-03-31T20:41:14.891 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool get testpool1234 pg_num_min
2026-03-31T20:41:14.891 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:14.891 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool get testpool1234 pg_num_min
2026-03-31T20:41:15.036 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'pg_num_min' is not set on pool 'testpool1234'
2026-03-31T20:41:15.039 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:15.039 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set testpool1234 pg_num_max 0
2026-03-31T20:41:16.882 INFO:tasks.workunit.client.0.vm03.stderr:set pool 52 pg_num_max to 0
2026-03-31T20:41:16.897 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool get testpool1234 pg_num_max
2026-03-31T20:41:16.898 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:16.898 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool get testpool1234 pg_num_max
2026-03-31T20:41:17.043 INFO:tasks.workunit.client.0.vm03.stderr:Error ENOENT: option 'pg_num_max' is not set on pool 'testpool1234'
2026-03-31T20:41:17.047 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:17.047 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete testpool1234 testpool1234 --yes-i-really-really-mean-it
2026-03-31T20:41:17.962 INFO:tasks.workunit.client.0.vm03.stderr:pool 'testpool1234' does not exist
2026-03-31T20:41:17.974 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool create foo 123 123 invalid foo-profile foo-rule
2026-03-31T20:41:17.974 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:17.974 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool create foo 123 123 invalid foo-profile foo-rule
2026-03-31T20:41:18.121 INFO:tasks.workunit.client.0.vm03.stderr:foo-rule not valid: foo-rule not one of 'true', 'false'
2026-03-31T20:41:18.121 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: unused arguments: ['foo-rule']
2026-03-31T20:41:18.121 INFO:tasks.workunit.client.0.vm03.stderr:osd pool create [] [] [] [] [] [] [] [] [] [] [--bulk] [] [] [--yes-i-really-mean-it] : create pool
2026-03-31T20:41:18.121 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command
2026-03-31T20:41:18.124 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:18.124 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool create foo 123 123 replicated
2026-03-31T20:41:18.984 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foo' already exists
2026-03-31T20:41:18.996 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool create fooo 123 123 erasure default
2026-03-31T20:41:20.022 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fooo' already exists
2026-03-31T20:41:20.035 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool create foooo 123
2026-03-31T20:41:21.048 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foooo' already exists
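[annotation] Setting either bound to 0 clears it, after which `get` fails with ENOENT; the test asserts that failure with expect_false. E.g.:
    ceph osd pool set testpool1234 pg_num_min 0   # clears the option
    ceph osd pool get testpool1234 pg_num_min     # Error ENOENT: option 'pg_num_min' is not set on pool 'testpool1234'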
2026-03-31T20:41:21.060 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool create foo 123
2026-03-31T20:41:21.252 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foo' already exists
2026-03-31T20:41:21.263 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set foo size 1 --yes-i-really-mean-it
2026-03-31T20:41:22.961 INFO:tasks.workunit.client.0.vm03.stderr:set pool 53 size to 1
2026-03-31T20:41:22.985 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_config_value foo min_size 1
2026-03-31T20:41:22.985 INFO:tasks.workunit.client.0.vm03.stderr:+ local pool_name config_opt expected_val val
2026-03-31T20:41:22.985 INFO:tasks.workunit.client.0.vm03.stderr:+ pool_name=foo
2026-03-31T20:41:22.985 INFO:tasks.workunit.client.0.vm03.stderr:+ config_opt=min_size
2026-03-31T20:41:22.985 INFO:tasks.workunit.client.0.vm03.stderr:+ expected_val=1
2026-03-31T20:41:22.985 INFO:tasks.workunit.client.0.vm03.stderr:++ get_config_value_or_die foo min_size
2026-03-31T20:41:22.985 INFO:tasks.workunit.client.0.vm03.stderr:++ local pool_name config_opt raw val
2026-03-31T20:41:22.985 INFO:tasks.workunit.client.0.vm03.stderr:++ pool_name=foo
2026-03-31T20:41:22.985 INFO:tasks.workunit.client.0.vm03.stderr:++ config_opt=min_size
2026-03-31T20:41:22.989 INFO:tasks.workunit.client.0.vm03.stderr:+++ ceph osd pool get foo min_size
2026-03-31T20:41:23.210 INFO:tasks.workunit.client.0.vm03.stderr:++ raw='min_size: 1'
2026-03-31T20:41:23.210 INFO:tasks.workunit.client.0.vm03.stderr:++ [[ 0 -ne 0 ]]
2026-03-31T20:41:23.210 INFO:tasks.workunit.client.0.vm03.stderr:+++ echo min_size: 1
2026-03-31T20:41:23.210 INFO:tasks.workunit.client.0.vm03.stderr:+++ sed -e 's/[{} "]//g'
2026-03-31T20:41:23.211 INFO:tasks.workunit.client.0.vm03.stderr:++ raw=min_size:1
2026-03-31T20:41:23.211 INFO:tasks.workunit.client.0.vm03.stderr:+++ echo min_size:1
2026-03-31T20:41:23.211 INFO:tasks.workunit.client.0.vm03.stderr:+++ cut -f2 -d:
2026-03-31T20:41:23.212 INFO:tasks.workunit.client.0.vm03.stderr:++ val=1
2026-03-31T20:41:23.212 INFO:tasks.workunit.client.0.vm03.stderr:++ echo 1
2026-03-31T20:41:23.212 INFO:tasks.workunit.client.0.vm03.stderr:++ return 0
2026-03-31T20:41:23.212 INFO:tasks.workunit.client.0.vm03.stderr:+ val=1
2026-03-31T20:41:23.212 INFO:tasks.workunit.client.0.vm03.stderr:+ [[ 1 != \1 ]]
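[annotation] The expect_config_value traces check the derived default for min_size. With no explicit min_size set, the values observed in this run are consistent with min_size = size - size/2 (integer division): size 1 -> 1, size 4 -> 2, size 10 -> 5. Condensed check:
    ceph osd pool set foo size 4
    ceph osd pool get foo min_size   # -> min_size: 2 (assuming the size - size/2 default)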
2026-03-31T20:41:23.212 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set foo size 4
2026-03-31T20:41:24.977 INFO:tasks.workunit.client.0.vm03.stderr:set pool 53 size to 4
2026-03-31T20:41:25.029 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_config_value foo min_size 2
2026-03-31T20:41:25.030 INFO:tasks.workunit.client.0.vm03.stderr:+ local pool_name config_opt expected_val val
2026-03-31T20:41:25.030 INFO:tasks.workunit.client.0.vm03.stderr:+ pool_name=foo
2026-03-31T20:41:25.030 INFO:tasks.workunit.client.0.vm03.stderr:+ config_opt=min_size
2026-03-31T20:41:25.030 INFO:tasks.workunit.client.0.vm03.stderr:+ expected_val=2
2026-03-31T20:41:25.030 INFO:tasks.workunit.client.0.vm03.stderr:++ get_config_value_or_die foo min_size
2026-03-31T20:41:25.030 INFO:tasks.workunit.client.0.vm03.stderr:++ local pool_name config_opt raw val
2026-03-31T20:41:25.030 INFO:tasks.workunit.client.0.vm03.stderr:++ pool_name=foo
2026-03-31T20:41:25.030 INFO:tasks.workunit.client.0.vm03.stderr:++ config_opt=min_size
2026-03-31T20:41:25.030 INFO:tasks.workunit.client.0.vm03.stderr:+++ ceph osd pool get foo min_size
2026-03-31T20:41:25.245 INFO:tasks.workunit.client.0.vm03.stderr:++ raw='min_size: 2'
2026-03-31T20:41:25.246 INFO:tasks.workunit.client.0.vm03.stderr:++ [[ 0 -ne 0 ]]
2026-03-31T20:41:25.246 INFO:tasks.workunit.client.0.vm03.stderr:+++ echo min_size: 2
2026-03-31T20:41:25.246 INFO:tasks.workunit.client.0.vm03.stderr:+++ sed -e 's/[{} "]//g'
2026-03-31T20:41:25.247 INFO:tasks.workunit.client.0.vm03.stderr:++ raw=min_size:2
2026-03-31T20:41:25.247 INFO:tasks.workunit.client.0.vm03.stderr:+++ echo min_size:2
2026-03-31T20:41:25.247 INFO:tasks.workunit.client.0.vm03.stderr:+++ cut -f2 -d:
2026-03-31T20:41:25.248 INFO:tasks.workunit.client.0.vm03.stderr:++ val=2
2026-03-31T20:41:25.248 INFO:tasks.workunit.client.0.vm03.stderr:++ echo 2
2026-03-31T20:41:25.248 INFO:tasks.workunit.client.0.vm03.stderr:++ return 0
2026-03-31T20:41:25.248 INFO:tasks.workunit.client.0.vm03.stderr:+ val=2
2026-03-31T20:41:25.248 INFO:tasks.workunit.client.0.vm03.stderr:+ [[ 2 != \2 ]]
2026-03-31T20:41:25.248 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set foo size 10
2026-03-31T20:41:27.008 INFO:tasks.workunit.client.0.vm03.stderr:set pool 53 size to 10
2026-03-31T20:41:27.029 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_config_value foo min_size 5
2026-03-31T20:41:27.029 INFO:tasks.workunit.client.0.vm03.stderr:+ local pool_name config_opt expected_val val
2026-03-31T20:41:27.029 INFO:tasks.workunit.client.0.vm03.stderr:+ pool_name=foo
2026-03-31T20:41:27.029 INFO:tasks.workunit.client.0.vm03.stderr:+ config_opt=min_size
2026-03-31T20:41:27.029 INFO:tasks.workunit.client.0.vm03.stderr:+ expected_val=5
2026-03-31T20:41:27.029 INFO:tasks.workunit.client.0.vm03.stderr:++ get_config_value_or_die foo min_size
2026-03-31T20:41:27.030 INFO:tasks.workunit.client.0.vm03.stderr:++ local pool_name config_opt raw val
2026-03-31T20:41:27.030 INFO:tasks.workunit.client.0.vm03.stderr:++ pool_name=foo
2026-03-31T20:41:27.030 INFO:tasks.workunit.client.0.vm03.stderr:++ config_opt=min_size
2026-03-31T20:41:27.030 INFO:tasks.workunit.client.0.vm03.stderr:+++ ceph osd pool get foo min_size
2026-03-31T20:41:27.248 INFO:tasks.workunit.client.0.vm03.stderr:++ raw='min_size: 5'
2026-03-31T20:41:27.248 INFO:tasks.workunit.client.0.vm03.stderr:++ [[ 0 -ne 0 ]]
2026-03-31T20:41:27.248 INFO:tasks.workunit.client.0.vm03.stderr:+++ echo min_size: 5
2026-03-31T20:41:27.248 INFO:tasks.workunit.client.0.vm03.stderr:+++ sed -e 's/[{} "]//g'
2026-03-31T20:41:27.249 INFO:tasks.workunit.client.0.vm03.stderr:++ raw=min_size:5
2026-03-31T20:41:27.249 INFO:tasks.workunit.client.0.vm03.stderr:+++ echo min_size:5
2026-03-31T20:41:27.249 INFO:tasks.workunit.client.0.vm03.stderr:+++ cut -f2 -d:
2026-03-31T20:41:27.250 INFO:tasks.workunit.client.0.vm03.stderr:++ val=5
2026-03-31T20:41:27.250 INFO:tasks.workunit.client.0.vm03.stderr:++ echo 5
2026-03-31T20:41:27.250 INFO:tasks.workunit.client.0.vm03.stderr:++ return 0
2026-03-31T20:41:27.250 INFO:tasks.workunit.client.0.vm03.stderr:+ val=5
2026-03-31T20:41:27.250 INFO:tasks.workunit.client.0.vm03.stderr:+ [[ 5 != \5 ]]
2026-03-31T20:41:27.250 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool set foo size 0
2026-03-31T20:41:27.250 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:27.250 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set foo size 0
2026-03-31T20:41:27.395 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: pool size must be between 1 and 10
2026-03-31T20:41:27.398 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:27.398 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool set foo size 20
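[annotation] Both out-of-range probes fail with "pool size must be between 1 and 10"; the upper bound is presumably the monitor's mon_max_pool_size option (default 10), an assumption this log does not itself confirm. E.g.:
    ceph osd pool set foo size 20            # Error EINVAL: pool size must be between 1 and 10
    ceph config get mon mon_max_pool_size    # assumption: this option caps the allowed replica count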
2026-03-31T20:41:27.398 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:27.398 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool set foo size 20
2026-03-31T20:41:27.543 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: pool size must be between 1 and 10
2026-03-31T20:41:27.546 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:27.546 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool delete foo
2026-03-31T20:41:27.546 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:27.547 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete foo
2026-03-31T20:41:27.692 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: WARNING: this will *PERMANENTLY DESTROY* all data stored in pool foo. If you are *ABSOLUTELY CERTAIN* that is what you want, pass the pool name *twice*, followed by --yes-i-really-really-mean-it.
2026-03-31T20:41:27.695 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:27.695 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool delete foo foo
2026-03-31T20:41:27.695 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:27.695 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete foo foo
2026-03-31T20:41:27.843 INFO:tasks.workunit.client.0.vm03.stderr:Error EPERM: WARNING: this will *PERMANENTLY DESTROY* all data stored in pool foo. If you are *ABSOLUTELY CERTAIN* that is what you want, pass the pool name *twice*, followed by --yes-i-really-really-mean-it.
2026-03-31T20:41:27.846 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:27.846 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool delete foo foo --force
2026-03-31T20:41:27.846 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:27.846 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete foo foo --force
2026-03-31T20:41:27.993 INFO:tasks.workunit.client.0.vm03.stderr:--force not valid: --force not one of 'true', 'false'
2026-03-31T20:41:27.993 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: unused arguments: ['--force']
2026-03-31T20:41:27.993 INFO:tasks.workunit.client.0.vm03.stderr:osd pool delete [] [--yes-i-really-really-mean-it] [--yes-i-really-really-mean-it-not-faking] : delete pool
2026-03-31T20:41:27.993 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command
2026-03-31T20:41:27.995 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:27.995 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool delete foo fooo --yes-i-really-mean-it
2026-03-31T20:41:27.996 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:27.996 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete foo fooo --yes-i-really-mean-it
2026-03-31T20:41:28.141 INFO:tasks.workunit.client.0.vm03.stderr:--yes-i-really-mean-it not valid: --yes-i-really-mean-it not one of 'true', 'false'
2026-03-31T20:41:28.141 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: unused arguments: ['--yes-i-really-mean-it']
2026-03-31T20:41:28.141 INFO:tasks.workunit.client.0.vm03.stderr:osd pool delete [] [--yes-i-really-really-mean-it] [--yes-i-really-really-mean-it-not-faking] : delete pool
2026-03-31T20:41:28.141 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command
2026-03-31T20:41:28.144 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:28.144 INFO:tasks.workunit.client.0.vm03.stderr:+ expect_false ceph osd pool delete foo --yes-i-really-mean-it foo
2026-03-31T20:41:28.144 INFO:tasks.workunit.client.0.vm03.stderr:+ set -x
2026-03-31T20:41:28.144 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete foo --yes-i-really-mean-it foo
2026-03-31T20:41:28.293 INFO:tasks.workunit.client.0.vm03.stderr:foo not valid: foo not one of 'true', 'false'
2026-03-31T20:41:28.293 INFO:tasks.workunit.client.0.vm03.stderr:Invalid command: unused arguments: ['foo']
2026-03-31T20:41:28.293 INFO:tasks.workunit.client.0.vm03.stderr:osd pool delete [] [--yes-i-really-really-mean-it] [--yes-i-really-really-mean-it-not-faking] : delete pool
2026-03-31T20:41:28.293 INFO:tasks.workunit.client.0.vm03.stderr:Error EINVAL: invalid command
2026-03-31T20:41:28.295 INFO:tasks.workunit.client.0.vm03.stderr:+ return 0
2026-03-31T20:41:28.295 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete foooo foooo --yes-i-really-really-mean-it
2026-03-31T20:41:29.115 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foooo' does not exist
2026-03-31T20:41:29.126 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete fooo fooo --yes-i-really-really-mean-it
2026-03-31T20:41:30.127 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fooo' does not exist
2026-03-31T20:41:30.139 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete foo foo --yes-i-really-really-mean-it
2026-03-31T20:41:31.113 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foo' does not exist
2026-03-31T20:41:31.124 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete foo foo --yes-i-really-really-mean-it
2026-03-31T20:41:31.314 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foo' does not exist
2026-03-31T20:41:31.325 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete fooo fooo --yes-i-really-really-mean-it
2026-03-31T20:41:31.521 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fooo' does not exist
2026-03-31T20:41:31.532 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete fooo fooo --yes-i-really-really-mean-it
2026-03-31T20:41:31.725 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fooo' does not exist
2026-03-31T20:41:31.736 INFO:tasks.workunit.client.0.vm03.stderr:+ ceph osd pool delete fuggg fuggg --yes-i-really-really-mean-it
2026-03-31T20:41:31.930 INFO:tasks.workunit.client.0.vm03.stderr:pool 'fuggg' does not exist
2026-03-31T20:41:31.941 INFO:tasks.workunit.client.0.vm03.stdout:OK
2026-03-31T20:41:31.941 INFO:tasks.workunit.client.0.vm03.stderr:+ echo OK
2026-03-31T20:41:31.941 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-31T20:41:31.942 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-31T20:41:31.991 INFO:tasks.workunit:Stopping ['mon/pool_ops.sh'] on client.0...
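[annotation] The delete sequence shows the guard rails: the bare form fails with EPERM, the name must be passed twice followed by --yes-i-really-really-mean-it, and --force, --yes-i-really-mean-it, or a misplaced argument are rejected as invalid commands. Deleting a pool that does not exist still exits zero, which is why the repeated cleanup deletes above all pass. E.g.:
    ceph osd pool delete foo                                     # Error EPERM: pass the name twice plus the flag
    ceph osd pool delete foo foo --yes-i-really-really-mean-it   # deletes (the mon may additionally require mon_allow_pool_delete=true, an assumption here)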
2026-03-31T20:41:31.991 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-31T20:41:32.338 DEBUG:teuthology.parallel:result is None
2026-03-31T20:41:32.338 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0
2026-03-31T20:41:32.345 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0
2026-03-31T20:41:32.345 DEBUG:teuthology.orchestra.run.vm03:> rmdir -- /home/ubuntu/cephtest/mnt.0
2026-03-31T20:41:32.391 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0
2026-03-31T20:41:32.391 DEBUG:teuthology.run_tasks:Unwinding manager ceph
2026-03-31T20:41:32.392 INFO:tasks.ceph.ceph_manager.ceph:waiting for clean
2026-03-31T20:41:32.392 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json
2026-03-31T20:41:32.585 INFO:teuthology.orchestra.run.vm03.stdout:
2026-03-31T20:41:32.586 INFO:teuthology.orchestra.run.vm03.stderr:dumped all
2026-03-31T20:41:32.598 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":1151,"stamp":"2026-03-31T20:41:31.740068+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":71,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":11,"num_bytes_recovered":1837158,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46,"ondisk_log_size":46,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":814,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":94452,"kb_used_data":13700,"kb_used_omap":999,"kb_used_meta":79704,"kb_avail":283021068,"statfs":{"total":289910292480,"available":289813573632,"internally_reserved":0,"allocated":14028800,"data_stored":6857494,"data_compressed":143397,"data_compressed_allocated":884736,"data_compressed_original":1486166,"omap_allocated":1023333,"internal_metadata":81617563},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":5,"apply_latency_ms":5,"commit_latency_ns":5000000,"apply_latency_ns":5000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"4.742500"},"pg_stats":[{"pgid":"1.0","version":"172'41","reported_seq":1291,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046268+0000","last_change":"2026-03-31T20:37:10.228861+0000","last_active":"2026-03-31T20:41:29.046268+0000","last_peered":"2026-03-31T20:41:29.046268+0000","last_clean":"2026-03-31T20:41:29.046268+0000","last_became_active":"2026-03-31T20:37:09.919785+0000","last_became_peered":"2026-03-31T20:37:09.919785+0000","last_unstale":"2026-03-31T20:41:29.046268+0000","last_undegraded":"2026-03-31T20:41:29.046268+0000","last_fullsized":"2026-03-31T20:41:29.046268+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'41","last_scrub_stamp":"2026-03-31T20:31:22.320510+0000","last_deep_scrub":"172'41","last_deep_scrub_stamp":"2026-03-31T20:31:21.304517+0000","last_clean_scrub_stamp":"2026-03-31T20:31:22.320510+0000","objects_scrubbed":2,"log_size":41,"log_dups_size":0,"ondisk_log_size":41,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-01T21:57:48.466412+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.3","version":"172'2","reported_seq":556,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.033656+0000","last_change":"2026-03-31T20:37:09.922257+0000","last_active":"2026-03-31T20:41:29.033656+0000","last_peered":"2026-03-31T20:41:29.033656+0000","last_clean":"2026-03-31T20:41:29.033656+0000","last_became_active":"2026-03-31T20:37:09.919661+0000","last_became_peered":"2026-03-31T20:37:09.919661+0000","last_unstale":"2026-03-31T20:41:29.033656+0000","last_undegraded":"2026-03-31T20:41:29.033656+0000","last_fullsized":"2026-03-31T20:41:29.033656+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'2","last_scrub_stamp":"2026-03-31T20:26:37.430905+0000","last_deep_scrub":"172'2","last_deep_scrub_stamp":"2026-03-31T20:26:31.466479+0000","last_clean_scrub_stamp":"2026-03-31T20:26:37.430905+0000","objects_scrubbed":1,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-02T07:38:57.731689+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":1,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"0'0","reported_seq":1651,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.034287+0000","last_change":"2026-03-31T20:37:09.919928+0000","last_active":"2026-03-31T20:41:29.034287+0000","last_peered":"2026-03-31T20:41:29.034287+0000","last_clean":"2026-03-31T20:41:29.034287+0000","last_became_active":"2026-03-31T20:37:09.919658+0000","last_became_peered":"2026-03-31T20:37:09.919658+0000","last_unstale":"2026-03-31T20:41:29.034287+0000","last_undegraded":"2026-03-31T20:41:29.034287+0000","last_fullsized":"2026-03-31T20:41:29.034287+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:33.059473+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:31.028583+0000","last_clean_scrub_stamp":"2026-03-31T20:26:33.059473+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:03:43.041389+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.1","version":"0'0","reported_seq":1645,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.032565+0000","last_change":"2026-03-31T20:26:34.047630+0000","last_active":"2026-03-31T20:41:29.032565+0000","last_peered":"2026-03-31T20:41:29.032565+0000","last_clean":"2026-03-31T20:41:29.032565+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:41:29.032565+0000","last_undegraded":"2026-03-31T20:41:29.032565+0000","last_fullsized":"2026-03-31T20:41:29.032565+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:34.047600+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.053167+0000","last_clean_scrub_stamp":"2026-03-31T20:26:34.047600+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T06:08:52.964865+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.2","version":"172'3","reported_seq":1570,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.047976+0000","last_change":"2026-03-31T20:29:18.927326+0000","last_active":"2026-03-31T20:41:29.047976+0000","last_peered":"2026-03-31T20:41:29.047976+0000","last_clean":"2026-03-31T20:41:29.047976+0000","last_became_active":"2026-03-31T20:29:18.927209+0000","last_became_peered":"2026-03-31T20:29:18.927209+0000","last_unstale":"2026-03-31T20:41:29.047976+0000","last_undegraded":"2026-03-31T20:41:29.047976+0000","last_fullsized":"2026-03-31T20:41:29.047976+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'3","last_scrub_stamp":"2026-03-31T20:26:31.889790+0000","last_deep_scrub":"172'3","last_deep_scrub_stamp":"2026-03-31T20:26:30.871302+0000","last_clean_scrub_stamp":"2026-03-31T20:26:31.889790+0000","objects_scrubbed":1,"log_size":3,"log_dups_size":0,"ondisk_log_size":3,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:50:50.550157+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.4","version":"0'0","reported_seq":903,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046320+0000","last_change":"2026-03-31T20:37:10.920859+0000","last_active":"2026-03-31T20:41:29.046320+0000","last_peered":"2026-03-31T20:41:29.046320+0000","last_clean":"2026-03-31T20:41:29.046320+0000","last_became_active":"2026-03-31T20:37:10.920712+0000","last_became_peered":"2026-03-31T20:37:10.920712+0000","last_unstale":"2026-03-31T20:41:29.046320+0000","last_undegraded":"2026-03-31T20:41:29.046320+0000","last_fullsized":"2026-03-31T20:41:29.046320+0000","mapping_epoch":710,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":711,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:39.414769+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:33.464380+0000","last_clean_scrub_stamp":"2026-03-31T20:26:39.414769+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:48:38.702799+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.5","version":"0'0","reported_seq":1162,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046003+0000","last_change":"2026-03-31T20:37:09.919126+0000","last_active":"2026-03-31T20:41:29.046003+0000","last_peered":"2026-03-31T20:41:29.046003+0000","last_clean":"2026-03-31T20:41:29.046003+0000","last_became_active":"2026-03-31T20:37:09.918910+0000","last_became_peered":"2026-03-31T20:37:09.918910+0000","last_unstale":"2026-03-31T20:41:29.046003+0000","last_undegraded":"2026-03-31T20:41:29.046003+0000","last_fullsized":"2026-03-31T20:41:29.046003+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:38.458279+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.494594+0000","last_clean_scrub_stamp":"2026-03-31T20:26:38.458279+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:33:50.695920+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.6","version":"0'0","reported_seq":1161,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046113+0000","last_change":"2026-03-31T20:37:09.918790+0000","last_active":"2026-03-31T20:41:29.046113+0000","last_peered":"2026-03-31T20:41:29.046113+0000","last_clean":"2026-03-31T20:41:29.046113+0000","last_became_active":"2026-03-31T20:37:09.918690+0000","last_became_peered":"2026-03-31T20:37:09.918690+0000","last_unstale":"2026-03-31T20:41:29.046113+0000","last_undegraded":"2026-03-31T20:41:29.046113+0000","last_fullsized":"2026-03-31T20:41:29.046113+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:41.404628+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:35.469461+0000","last_clean_scrub_stamp":"2026-03-31T20:26:41.404628+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:39:39.971833+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.7","version":"0'0","reported_seq":1165,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046047+0000","last_change":"2026-03-31T20:37:07.899092+0000","last_active":"2026-03-31T20:41:29.046047+0000","last_peered":"2026-03-31T20:41:29.046047+0000","last_clean":"2026-03-31T20:41:29.046047+0000","last_became_active":"2026-03-31T20:37:07.898875+0000","last_became_peered":"2026-03-31T20:37:07.898875+0000","last_unstale":"2026-03-31T20:41:29.046047+0000","last_undegraded":"2026-03-31T20:41:29.046047+0000","last_fullsized":"2026-03-31T20:41:29.046047+0000","mapping_epoch":707,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":708,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:40.426768+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:34.453503+0000","last_clean_scrub_stamp":"2026-03-31T20:26:40.426768+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:27:47.105587+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":3,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8405,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":704512,"data_stored":1377840,"data_compressed":14356,"data_compressed_allocated":700416,"data_compressed_original":1373744,"omap_allocated":6380,"internal_metadata":0},"log_size":41,"
ondisk_log_size":41,"up":2,"acting":2,"num_store_stats":3}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738630,"num_pgs":364,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31468,"kb_used_data":4636,"kb_used_omap":526,"kb_used_meta":26289,"kb_avail":94340372,"statfs":{"total":96636764160,"available":96604540928,"internally_reserved":0,"allocated":4747264,"data_stored":2437324,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":538982,"internal_metadata":26920602},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738630,"num_pgs":93,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31512,"kb_used_data":4424,"kb_used_omap":61,"kb_used_meta":27010,"kb_avail":94340328,"statfs":{"total":96636764160,"available":96604495872,"internally_reserved":0,"allocated":4530176,"data_stored":1982827,"data_compressed":44615,"data_compressed_allocated":139264,"data_compressed_original":189202,"omap_allocated":63487,"internal_metadata":27658241},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772678,"num_pgs":357,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31472,"kb_used_data":4640,"kb_used_omap":411,"kb_used_meta":26405,"kb_avail":94340368,"statfs":{"total":96636764160,"available":96604536832,"internally_reserved":0,"allocated":4751360,"data_stored":2437343,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":420864,"internal_metadata":27038720},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":6380,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3190,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2503,"internal_metadata":0},{"poolid":2,"o
sd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2712,"internal_metadata":0}]}} 2026-03-31T20:41:32.598 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-31T20:41:32.750 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:41:32.750 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-31T20:41:32.762 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":1151,"stamp":"2026-03-31T20:41:31.740068+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":71,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":11,"num_bytes_recovered":1837158,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46,"ondisk_log_size":46,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":814,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":94452,"kb_used_data":13700,"kb_used_omap":999,"kb_used_meta":79704,"kb_avail":283021068,"statfs":{"total":289910292480,"available":289813573632,"internally_reserved":0,"allocated":14028800,"data_stored":6857494,"data_compressed":143397,"data_compressed_allocated":884736,"data_compressed_original":1486166,"omap_allocated":1023333,"internal_metadata":81617563},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":5,"apply_latency_ms":5,"commit_latency_ns":5000000,"apply_latency_ns":5000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_obj
ects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"4.742500"},"pg_stats":[{"pgid":"1.0","version":"172'41","reported_seq":1291,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046268+0000","last_change":"2026-03-31T20:37:10.228861+0000","last_active":"2026-03-31T20:41:29.046268+0000","last_peered":"2026-03-31T20:41:29.046268+0000","last_clean":"2026-03-31T20:41:29.046268+0000","last_became_active":"2026-03-31T20:37:09.919785+0000","last_became_peered":"2026-03-31T20:37:09.919785+0000","last_unstale":"2026-03-31T20:41:29.046268+0000","last_undegraded":"2026-03-31T20:41:29.046268+0000","last_fullsized":"2026-03-31T20:41:29.046268+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'41","last_scrub_stamp":"2026-03-31T20:31:22.320510+0000","last_deep_scrub":"172'41","last_deep_scrub_stamp":"2026-03-31T20:31:21.304517+0000","last_clean_scrub_stamp":"2026-03-31T20:31:22.320510+0000","objects_scrubbed":2,"log_size":41,"log_dups_size":0,"ondisk_log_size":41,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-01T21:57:48.466412+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.3","version":"172'2","reported_seq":556,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.033656+0000","last_change":"2026-03-31T20:37:09.922257+0000","last_active":"2026-03-31T20:41:29.033656+0000","last_peered":"2026-03-31T20:41:29.033656+0000","last_clean":"2026-03-31T20:41:29.033656+0000","last_became_active":"2026-03-31T20:37:09.919661+0000","last_became_peered":"2026-03-31T20:37:09.919661+0000","last_unstale":"2026-03-31T20:41:29.033656+0000","last_undegraded":"2026-03-31T20:41:29.033656+0000","last_fullsized":"2026-03-31T20:41:29.033656+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"par
ent":"0.0","parent_split_bits":0,"last_scrub":"172'2","last_scrub_stamp":"2026-03-31T20:26:37.430905+0000","last_deep_scrub":"172'2","last_deep_scrub_stamp":"2026-03-31T20:26:31.466479+0000","last_clean_scrub_stamp":"2026-03-31T20:26:37.430905+0000","objects_scrubbed":1,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-02T07:38:57.731689+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":1,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"0'0","reported_seq":1651,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.034287+0000","last_change":"2026-03-31T20:37:09.919928+0000","last_active":"2026-03-31T20:41:29.034287+0000","last_peered":"2026-03-31T20:41:29.034287+0000","last_clean":"2026-03-31T20:41:29.034287+0000","last_became_active":"2026-03-31T20:37:09.919658+0000","last_became_peered":"2026-03-31T20:37:09.919658+0000","last_unstale":"2026-03-31T20:41:29.034287+0000","last_undegraded":"2026-03-31T20:41:29.034287+0000","last_fullsized":"2026-03-31T20:41:29.034287+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:33.059473+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:31.028583+0000","last_clean_scrub_stamp":"2026-03-31T20:26:33.059473+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:03:43.041389+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.1","version":"0'0","reported_seq":1645,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.032565+0000","last_change":"2026-03-31T20:26:34.047630+0000","last_active":"2026-03-31T20:41:29.032565+0000","last_peered":"2026-03-31T20:41:29.032565+0000","last_clean":"2026-03-31T20:41:29.032565+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:41:29.032565+0000","last_undegraded":"2026-03-31T20:41:29.032565+0000","last_fullsized":"2026-03-31T20:41:29.032565+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:34.047600+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.053167+0000","last_clean_scrub_stamp":"2026-03-31T20:26:34.047600+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T06:08:52.964865+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.2","version":"172'3","reported_seq":1570,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.047976+0000","last_change":"2026-03-31T20:29:18.927326+0000","last_active":"2026-03-31T20:41:29.047976+0000","last_peered":"2026-03-31T20:41:29.047976+0000","last_clean":"2026-03-31T20:41:29.047976+0000","last_became_active":"2026-03-31T20:29:18.927209+0000","last_became_peered":"2026-03-31T20:29:18.927209+0000","last_unstale":"2026-03-31T20:41:29.047976+0000","last_undegraded":"2026-03-31T20:41:29.047976+0000","last_fullsized":"2026-03-31T20:41:29.047976+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'3","last_scrub_stamp":"2026-03-31T20:26:31.889790+0000","last_deep_scrub":"172'3","last_deep_scrub_stamp":"2026-03-31T20:26:30.871302+0000","last_clean_scrub_stamp":"2026-03-31T20:26:31.889790+0000","objects_scrubbed":1,"log_size":3,"log_dups_size":0,"ondisk_log_size":3,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:50:50.550157+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.4","version":"0'0","reported_seq":903,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046320+0000","last_change":"2026-03-31T20:37:10.920859+0000","last_active":"2026-03-31T20:41:29.046320+0000","last_peered":"2026-03-31T20:41:29.046320+0000","last_clean":"2026-03-31T20:41:29.046320+0000","last_became_active":"2026-03-31T20:37:10.920712+0000","last_became_peered":"2026-03-31T20:37:10.920712+0000","last_unstale":"2026-03-31T20:41:29.046320+0000","last_undegraded":"2026-03-31T20:41:29.046320+0000","last_fullsized":"2026-03-31T20:41:29.046320+0000","mapping_epoch":710,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":711,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:39.414769+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:33.464380+0000","last_clean_scrub_stamp":"2026-03-31T20:26:39.414769+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:48:38.702799+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.5","version":"0'0","reported_seq":1162,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046003+0000","last_change":"2026-03-31T20:37:09.919126+0000","last_active":"2026-03-31T20:41:29.046003+0000","last_peered":"2026-03-31T20:41:29.046003+0000","last_clean":"2026-03-31T20:41:29.046003+0000","last_became_active":"2026-03-31T20:37:09.918910+0000","last_became_peered":"2026-03-31T20:37:09.918910+0000","last_unstale":"2026-03-31T20:41:29.046003+0000","last_undegraded":"2026-03-31T20:41:29.046003+0000","last_fullsized":"2026-03-31T20:41:29.046003+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:38.458279+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.494594+0000","last_clean_scrub_stamp":"2026-03-31T20:26:38.458279+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:33:50.695920+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.6","version":"0'0","reported_seq":1161,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046113+0000","last_change":"2026-03-31T20:37:09.918790+0000","last_active":"2026-03-31T20:41:29.046113+0000","last_peered":"2026-03-31T20:41:29.046113+0000","last_clean":"2026-03-31T20:41:29.046113+0000","last_became_active":"2026-03-31T20:37:09.918690+0000","last_became_peered":"2026-03-31T20:37:09.918690+0000","last_unstale":"2026-03-31T20:41:29.046113+0000","last_undegraded":"2026-03-31T20:41:29.046113+0000","last_fullsized":"2026-03-31T20:41:29.046113+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:41.404628+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:35.469461+0000","last_clean_scrub_stamp":"2026-03-31T20:26:41.404628+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:39:39.971833+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.7","version":"0'0","reported_seq":1165,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046047+0000","last_change":"2026-03-31T20:37:07.899092+0000","last_active":"2026-03-31T20:41:29.046047+0000","last_peered":"2026-03-31T20:41:29.046047+0000","last_clean":"2026-03-31T20:41:29.046047+0000","last_became_active":"2026-03-31T20:37:07.898875+0000","last_became_peered":"2026-03-31T20:37:07.898875+0000","last_unstale":"2026-03-31T20:41:29.046047+0000","last_undegraded":"2026-03-31T20:41:29.046047+0000","last_fullsized":"2026-03-31T20:41:29.046047+0000","mapping_epoch":707,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":708,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:40.426768+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:34.453503+0000","last_clean_scrub_stamp":"2026-03-31T20:26:40.426768+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:27:47.105587+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":3,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8405,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":704512,"data_stored":1377840,"data_compressed":14356,"data_compressed_allocated":700416,"data_compressed_original":1373744,"omap_allocated":6380,"internal_metadata":0},"log_size":41,"
ondisk_log_size":41,"up":2,"acting":2,"num_store_stats":3}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738630,"num_pgs":364,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31468,"kb_used_data":4636,"kb_used_omap":526,"kb_used_meta":26289,"kb_avail":94340372,"statfs":{"total":96636764160,"available":96604540928,"internally_reserved":0,"allocated":4747264,"data_stored":2437324,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":538982,"internal_metadata":26920602},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738630,"num_pgs":93,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31512,"kb_used_data":4424,"kb_used_omap":61,"kb_used_meta":27010,"kb_avail":94340328,"statfs":{"total":96636764160,"available":96604495872,"internally_reserved":0,"allocated":4530176,"data_stored":1982827,"data_compressed":44615,"data_compressed_allocated":139264,"data_compressed_original":189202,"omap_allocated":63487,"internal_metadata":27658241},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772678,"num_pgs":357,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31472,"kb_used_data":4640,"kb_used_omap":411,"kb_used_meta":26405,"kb_avail":94340368,"statfs":{"total":96636764160,"available":96604536832,"internally_reserved":0,"allocated":4751360,"data_stored":2437343,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":420864,"internal_metadata":27038720},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":6380,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3190,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2503,"internal_metadata":0},{"poolid":2,"o
sd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2712,"internal_metadata":0}]}} 2026-03-31T20:41:32.762 INFO:tasks.ceph.ceph_manager.ceph:clean! 2026-03-31T20:41:32.762 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-31T20:41:32.914 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:41:32.915 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-31T20:41:32.926 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":1151,"stamp":"2026-03-31T20:41:31.740068+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":71,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":11,"num_bytes_recovered":1837158,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46,"ondisk_log_size":46,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":814,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":94452,"kb_used_data":13700,"kb_used_omap":999,"kb_used_meta":79704,"kb_avail":283021068,"statfs":{"total":289910292480,"available":289813573632,"internally_reserved":0,"allocated":14028800,"data_stored":6857494,"data_compressed":143397,"data_compressed_allocated":884736,"data_compressed_original":1486166,"omap_allocated":1023333,"internal_metadata":81617563},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":5,"apply_latency_ms":5,"commit_latency_ns":5000000,"apply_latency_ns":5000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinn
ed":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"4.742500"},"pg_stats":[{"pgid":"1.0","version":"172'41","reported_seq":1291,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046268+0000","last_change":"2026-03-31T20:37:10.228861+0000","last_active":"2026-03-31T20:41:29.046268+0000","last_peered":"2026-03-31T20:41:29.046268+0000","last_clean":"2026-03-31T20:41:29.046268+0000","last_became_active":"2026-03-31T20:37:09.919785+0000","last_became_peered":"2026-03-31T20:37:09.919785+0000","last_unstale":"2026-03-31T20:41:29.046268+0000","last_undegraded":"2026-03-31T20:41:29.046268+0000","last_fullsized":"2026-03-31T20:41:29.046268+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'41","last_scrub_stamp":"2026-03-31T20:31:22.320510+0000","last_deep_scrub":"172'41","last_deep_scrub_stamp":"2026-03-31T20:31:21.304517+0000","last_clean_scrub_stamp":"2026-03-31T20:31:22.320510+0000","objects_scrubbed":2,"log_size":41,"log_dups_size":0,"ondisk_log_size":41,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-01T21:57:48.466412+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.3","version":"172'2","reported_seq":556,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.033656+0000","last_change":"2026-03-31T20:37:09.922257+0000","last_active":"2026-03-31T20:41:29.033656+0000","last_peered":"2026-03-31T20:41:29.033656+0000","last_clean":"2026-03-31T20:41:29.033656+0000","last_became_active":"2026-03-31T20:37:09.919661+0000","last_became_peered":"2026-03-31T20:37:09.919661+0000","last_unstale":"2026-03-31T20:41:29.033656+0000","last_undegraded":"2026-03-31T20:41:29.033656+0000","last_fullsized":"2026-03-31T20:41:29.033656+0000","mapping_epoch":709,"log_start":"0'0",
"ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'2","last_scrub_stamp":"2026-03-31T20:26:37.430905+0000","last_deep_scrub":"172'2","last_deep_scrub_stamp":"2026-03-31T20:26:31.466479+0000","last_clean_scrub_stamp":"2026-03-31T20:26:37.430905+0000","objects_scrubbed":1,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-02T07:38:57.731689+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":1,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"0'0","reported_seq":1651,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.034287+0000","last_change":"2026-03-31T20:37:09.919928+0000","last_active":"2026-03-31T20:41:29.034287+0000","last_peered":"2026-03-31T20:41:29.034287+0000","last_clean":"2026-03-31T20:41:29.034287+0000","last_became_active":"2026-03-31T20:37:09.919658+0000","last_became_peered":"2026-03-31T20:37:09.919658+0000","last_unstale":"2026-03-31T20:41:29.034287+0000","last_undegraded":"2026-03-31T20:41:29.034287+0000","last_fullsized":"2026-03-31T20:41:29.034287+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:33.059473+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:31.028583+0000","last_clean_scrub_stamp":"2026-03-31T20:26:33.059473+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:03:43.041389+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.1","version":"0'0","reported_seq":1645,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.032565+0000","last_change":"2026-03-31T20:26:34.047630+0000","last_active":"2026-03-31T20:41:29.032565+0000","last_peered":"2026-03-31T20:41:29.032565+0000","last_clean":"2026-03-31T20:41:29.032565+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:41:29.032565+0000","last_undegraded":"2026-03-31T20:41:29.032565+0000","last_fullsized":"2026-03-31T20:41:29.032565+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:34.047600+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.053167+0000","last_clean_scrub_stamp":"2026-03-31T20:26:34.047600+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T06:08:52.964865+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.2","version":"172'3","reported_seq":1570,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.047976+0000","last_change":"2026-03-31T20:29:18.927326+0000","last_active":"2026-03-31T20:41:29.047976+0000","last_peered":"2026-03-31T20:41:29.047976+0000","last_clean":"2026-03-31T20:41:29.047976+0000","last_became_active":"2026-03-31T20:29:18.927209+0000","last_became_peered":"2026-03-31T20:29:18.927209+0000","last_unstale":"2026-03-31T20:41:29.047976+0000","last_undegraded":"2026-03-31T20:41:29.047976+0000","last_fullsized":"2026-03-31T20:41:29.047976+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'3","last_scrub_stamp":"2026-03-31T20:26:31.889790+0000","last_deep_scrub":"172'3","last_deep_scrub_stamp":"2026-03-31T20:26:30.871302+0000","last_clean_scrub_stamp":"2026-03-31T20:26:31.889790+0000","objects_scrubbed":1,"log_size":3,"log_dups_size":0,"ondisk_log_size":3,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:50:50.550157+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.4","version":"0'0","reported_seq":903,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046320+0000","last_change":"2026-03-31T20:37:10.920859+0000","last_active":"2026-03-31T20:41:29.046320+0000","last_peered":"2026-03-31T20:41:29.046320+0000","last_clean":"2026-03-31T20:41:29.046320+0000","last_became_active":"2026-03-31T20:37:10.920712+0000","last_became_peered":"2026-03-31T20:37:10.920712+0000","last_unstale":"2026-03-31T20:41:29.046320+0000","last_undegraded":"2026-03-31T20:41:29.046320+0000","last_fullsized":"2026-03-31T20:41:29.046320+0000","mapping_epoch":710,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":711,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:39.414769+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:33.464380+0000","last_clean_scrub_stamp":"2026-03-31T20:26:39.414769+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:48:38.702799+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.5","version":"0'0","reported_seq":1162,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046003+0000","last_change":"2026-03-31T20:37:09.919126+0000","last_active":"2026-03-31T20:41:29.046003+0000","last_peered":"2026-03-31T20:41:29.046003+0000","last_clean":"2026-03-31T20:41:29.046003+0000","last_became_active":"2026-03-31T20:37:09.918910+0000","last_became_peered":"2026-03-31T20:37:09.918910+0000","last_unstale":"2026-03-31T20:41:29.046003+0000","last_undegraded":"2026-03-31T20:41:29.046003+0000","last_fullsized":"2026-03-31T20:41:29.046003+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:38.458279+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.494594+0000","last_clean_scrub_stamp":"2026-03-31T20:26:38.458279+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:33:50.695920+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.6","version":"0'0","reported_seq":1161,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046113+0000","last_change":"2026-03-31T20:37:09.918790+0000","last_active":"2026-03-31T20:41:29.046113+0000","last_peered":"2026-03-31T20:41:29.046113+0000","last_clean":"2026-03-31T20:41:29.046113+0000","last_became_active":"2026-03-31T20:37:09.918690+0000","last_became_peered":"2026-03-31T20:37:09.918690+0000","last_unstale":"2026-03-31T20:41:29.046113+0000","last_undegraded":"2026-03-31T20:41:29.046113+0000","last_fullsized":"2026-03-31T20:41:29.046113+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:41.404628+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:35.469461+0000","last_clean_scrub_stamp":"2026-03-31T20:26:41.404628+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:39:39.971833+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.7","version":"0'0","reported_seq":1165,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046047+0000","last_change":"2026-03-31T20:37:07.899092+0000","last_active":"2026-03-31T20:41:29.046047+0000","last_peered":"2026-03-31T20:41:29.046047+0000","last_clean":"2026-03-31T20:41:29.046047+0000","last_became_active":"2026-03-31T20:37:07.898875+0000","last_became_peered":"2026-03-31T20:37:07.898875+0000","last_unstale":"2026-03-31T20:41:29.046047+0000","last_undegraded":"2026-03-31T20:41:29.046047+0000","last_fullsized":"2026-03-31T20:41:29.046047+0000","mapping_epoch":707,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":708,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:40.426768+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:34.453503+0000","last_clean_scrub_stamp":"2026-03-31T20:26:40.426768+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:27:47.105587+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":3,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8405,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":704512,"data_stored":1377840,"data_compressed":14356,"data_compressed_allocated":700416,"data_compressed_original":1373744,"omap_allocated":6380,"internal_metadata":0},"log_size":41,"
ondisk_log_size":41,"up":2,"acting":2,"num_store_stats":3}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738630,"num_pgs":364,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31468,"kb_used_data":4636,"kb_used_omap":526,"kb_used_meta":26289,"kb_avail":94340372,"statfs":{"total":96636764160,"available":96604540928,"internally_reserved":0,"allocated":4747264,"data_stored":2437324,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":538982,"internal_metadata":26920602},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738630,"num_pgs":93,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31512,"kb_used_data":4424,"kb_used_omap":61,"kb_used_meta":27010,"kb_avail":94340328,"statfs":{"total":96636764160,"available":96604495872,"internally_reserved":0,"allocated":4530176,"data_stored":1982827,"data_compressed":44615,"data_compressed_allocated":139264,"data_compressed_original":189202,"omap_allocated":63487,"internal_metadata":27658241},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772678,"num_pgs":357,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31472,"kb_used_data":4640,"kb_used_omap":411,"kb_used_meta":26405,"kb_avail":94340368,"statfs":{"total":96636764160,"available":96604536832,"internally_reserved":0,"allocated":4751360,"data_stored":2437343,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":420864,"internal_metadata":27038720},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":6380,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3190,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2503,"internal_metadata":0},{"poolid":2,"o
sd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2712,"internal_metadata":0}]}} 2026-03-31T20:41:32.927 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd dump --format=json 2026-03-31T20:41:33.085 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:41:33.085 INFO:teuthology.orchestra.run.vm03.stdout:{"epoch":811,"fsid":"a4a0ca01-ae82-443e-a7c7-50605716689a","created":"2026-03-31T20:21:23.960433+0000","modified":"2026-03-31T20:41:31.031227+0000","last_up_change":"2026-03-31T20:27:22.084074+0000","last_in_change":"2026-03-31T20:30:11.536237+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":27,"full_ratio":0.96200001239776611,"backfillfull_ratio":0.91200000047683716,"nearfull_ratio":0.89200001955032349,"cluster_snapshot":"","pool_max":55,"max_osd":3,"require_min_compat_client":"luminous","min_compat_client":"hammer","require_osd_release":"tentacle","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-31T20:21:28.536032+0000","flags":1,"flags_names":"hashpspool","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"449","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":2.9900000095367432,"score_stable":2.9900000095367432,"optimal_score":0.67000001668930054,"raw_score_acting":2,"raw_score_stable":2,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}},{"pool":2,"pool_name":"rbd","create_time":"2026-03-31T20:21:31.856309+0000","flags":8193,"flags_names":"hashpspool,selfmanaged_snaps","type":1,"size":2,"min_size":1,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"is_stretch_pool":false,"object_hash":2,"pg_autoscale_mode":"off","pg_num":8,"pg_placement_num":8,"pg_placement_num_target":8,"pg_num_target":8,"pg_num_pending":8,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"695","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":2,"snap_epoch":635,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"nonprimary_shards":"{}","options":{},"application_metadata":{"rbd":{}},"read_balance":{"score_type":"Fair 
distribution","score_acting":1.8799999952316284,"score_stable":1.8799999952316284,"optimal_score":1,"raw_score_acting":1.8799999952316284,"raw_score_stable":1.8799999952316284,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"f3bff22c-a21c-470a-aec1-c158b49523b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":8,"last_clean_end":220,"up_from":221,"up_thru":807,"down_at":219,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6804","nonce":950776786}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6813","nonce":951776786}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6807","nonce":950776786}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6806","nonce":950776786}]},"public_addr":"192.168.123.103:6804/950776786","cluster_addr":"192.168.123.103:6813/951776786","heartbeat_back_addr":"192.168.123.103:6807/950776786","heartbeat_front_addr":"192.168.123.103:6806/950776786","state":["exists","up"]},{"osd":1,"uuid":"ea32c3bc-b054-4d75-ba06-731b384d7a6a","up":1,"in":1,"weight":0.355560302734375,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":807,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6800","nonce":3578452477}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6801","nonce":3578452477}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6803","nonce":3578452477}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6802","nonce":3578452477}]},"public_addr":"192.168.123.103:6800/3578452477","cluster_addr":"192.168.123.103:6801/3578452477","heartbeat_back_addr":"192.168.123.103:6803/3578452477","heartbeat_front_addr":"192.168.123.103:6802/3578452477","state":["exists","up"]},{"osd":2,"uuid":"b8c0450a-00e1-4ca8-b0ac-3f046a80f1b8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":8,"up_thru":807,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6808","nonce":1523795840}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6809","nonce":1523795840}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6811","nonce":1523795840}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.103:6810","nonce":1523795840}]},"public_addr":"192.168.123.103:6808/1523795840","cluster_addr":"192.168.123.103:6809/1523795840","heartbeat_back_addr":"192.168.123.103:6811/1523795840","heartbeat_front_addr":"192.168.123.103:6810/1523795840","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"2026-03-31T20:27:20.071779+0000","laggy_probability":0.30000001192092896,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"2026-03-31T20:21:26.637032+0000","dead_epoch":219},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"2026-03-31T20:21:26.626364+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4541880224203014143,"old_weight":0,"last_purged_snaps_scrub":"2026-03-31T20:21:26.699584+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure
-domain":"osd","k":"2","m":"1","plugin":"isa","technique":"reed_sol_van"}},"removed_snaps_queue":[{"pool":26,"snaps":[{"begin":1,"length":1}]}],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-31T20:41:34.098 INFO:tasks.ceph:Scrubbing osd.0 2026-03-31T20:41:34.099 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.0 config set osd_debug_deep_scrub_sleep 0 2026-03-31T20:41:34.172 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-31T20:41:34.172 INFO:teuthology.orchestra.run.vm03.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) " 2026-03-31T20:41:34.172 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-31T20:41:34.181 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 0 2026-03-31T20:41:34.331 INFO:teuthology.orchestra.run.vm03.stderr:instructed osd(s) 0 to deep-scrub 2026-03-31T20:41:34.343 INFO:tasks.ceph:Scrubbing osd.1 2026-03-31T20:41:34.343 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.1 config set osd_debug_deep_scrub_sleep 0 2026-03-31T20:41:34.416 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-31T20:41:34.416 INFO:teuthology.orchestra.run.vm03.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-31T20:41:34.416 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-31T20:41:34.425 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 1 2026-03-31T20:41:34.575 INFO:teuthology.orchestra.run.vm03.stderr:instructed osd(s) 1 to deep-scrub 2026-03-31T20:41:34.587 INFO:tasks.ceph:Scrubbing osd.2 2026-03-31T20:41:34.587 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph tell osd.2 config set osd_debug_deep_scrub_sleep 0 2026-03-31T20:41:34.660 INFO:teuthology.orchestra.run.vm03.stdout:{ 2026-03-31T20:41:34.660 INFO:teuthology.orchestra.run.vm03.stdout: "success": "osd_debug_deep_scrub_sleep = '' (not observed, change may require restart) osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' 
osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-31T20:41:34.660 INFO:teuthology.orchestra.run.vm03.stdout:} 2026-03-31T20:41:34.669 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph osd deep-scrub 2 2026-03-31T20:41:34.821 INFO:teuthology.orchestra.run.vm03.stderr:instructed osd(s) 2 to deep-scrub 2026-03-31T20:41:34.832 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-31T20:41:34.981 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:41:34.981 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-31T20:41:34.993 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":1152,"stamp":"2026-03-31T20:41:33.740287+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":71,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":11,"num_bytes_recovered":1837158,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46,"ondisk_log_size":46,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":814,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":94452,"kb_used_data":13700,"kb_used_omap":999,"kb_used_meta":79704,"kb_avail":283021068,"statfs":{"total":289910292480,"available":289813573632,"internally_reserved":0,"allocated":14028800,"data_stored":6857494,"data_compressed":143397,"data_compressed_allocated":884736,"data_compressed_original":1486166,"omap_allocated":1023333,"internal_metadata":81617563},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":5,"apply_latency_ms":5,"commit_latency_ns":5000000,"apply_latency_ns":5000000},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_obj
ects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"6.000974"},"pg_stats":[{"pgid":"1.0","version":"172'41","reported_seq":1291,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046268+0000","last_change":"2026-03-31T20:37:10.228861+0000","last_active":"2026-03-31T20:41:29.046268+0000","last_peered":"2026-03-31T20:41:29.046268+0000","last_clean":"2026-03-31T20:41:29.046268+0000","last_became_active":"2026-03-31T20:37:09.919785+0000","last_became_peered":"2026-03-31T20:37:09.919785+0000","last_unstale":"2026-03-31T20:41:29.046268+0000","last_undegraded":"2026-03-31T20:41:29.046268+0000","last_fullsized":"2026-03-31T20:41:29.046268+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'41","last_scrub_stamp":"2026-03-31T20:31:22.320510+0000","last_deep_scrub":"172'41","last_deep_scrub_stamp":"2026-03-31T20:31:21.304517+0000","last_clean_scrub_stamp":"2026-03-31T20:31:22.320510+0000","objects_scrubbed":2,"log_size":41,"log_dups_size":0,"ondisk_log_size":41,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T21:57:48.466412+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.3","version":"172'2","reported_seq":556,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.033656+0000","last_change":"2026-03-31T20:37:09.922257+0000","last_active":"2026-03-31T20:41:29.033656+0000","last_peered":"2026-03-31T20:41:29.033656+0000","last_clean":"2026-03-31T20:41:29.033656+0000","last_became_active":"2026-03-31T20:37:09.919661+0000","last_became_peered":"2026-03-31T20:37:09.919661+0000","last_unstale":"2026-03-31T20:41:29.033656+0000","last_undegraded":"2026-03-31T20:41:29.033656+0000","last_fullsized":"2026-03-31T20:41:29.033656+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'2","last_scrub_stamp":"2026-03-31T20:26:37.430905+0000","last_deep_scrub":"172'2","last_deep_scrub_stamp":"2026-03-31T20:26:31.466479+0000","last_clean_scrub_stamp":"2026-03-31T20:26:37.430905+0000","objects_scrubbed":1,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T07:38:57.731689+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":1,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"0'0","reported_seq":1651,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.034287+0000","last_change":"2026-03-31T20:37:09.919928+0000","last_active":"2026-03-31T20:41:29.034287+0000","last_peered":"2026-03-31T20:41:29.034287+0000","last_clean":"2026-03-31T20:41:29.034287+0000","last_became_active":"2026-03-31T20:37:09.919658+0000","last_became_peered":"2026-03-31T20:37:09.919658+0000","last_unstale":"2026-03-31T20:41:29.034287+0000","last_undegraded":"2026-03-31T20:41:29.034287+0000","last_fullsized":"2026-03-31T20:41:29.034287+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:33.059473+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:31.028583+0000","last_clean_scrub_stamp":"2026-03-31T20:26:33.059473+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:03:43.041389+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.1","version":"0'0","reported_seq":1645,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.032565+0000","last_change":"2026-03-31T20:26:34.047630+0000","last_active":"2026-03-31T20:41:29.032565+0000","last_peered":"2026-03-31T20:41:29.032565+0000","last_clean":"2026-03-31T20:41:29.032565+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:41:29.032565+0000","last_undegraded":"2026-03-31T20:41:29.032565+0000","last_fullsized":"2026-03-31T20:41:29.032565+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:34.047600+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.053167+0000","last_clean_scrub_stamp":"2026-03-31T20:26:34.047600+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T06:08:52.964865+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.2","version":"172'3","reported_seq":1570,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.047976+0000","last_change":"2026-03-31T20:29:18.927326+0000","last_active":"2026-03-31T20:41:29.047976+0000","last_peered":"2026-03-31T20:41:29.047976+0000","last_clean":"2026-03-31T20:41:29.047976+0000","last_became_active":"2026-03-31T20:29:18.927209+0000","last_became_peered":"2026-03-31T20:29:18.927209+0000","last_unstale":"2026-03-31T20:41:29.047976+0000","last_undegraded":"2026-03-31T20:41:29.047976+0000","last_fullsized":"2026-03-31T20:41:29.047976+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'3","last_scrub_stamp":"2026-03-31T20:26:31.889790+0000","last_deep_scrub":"172'3","last_deep_scrub_stamp":"2026-03-31T20:26:30.871302+0000","last_clean_scrub_stamp":"2026-03-31T20:26:31.889790+0000","objects_scrubbed":1,"log_size":3,"log_dups_size":0,"ondisk_log_size":3,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:50:50.550157+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.4","version":"0'0","reported_seq":903,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046320+0000","last_change":"2026-03-31T20:37:10.920859+0000","last_active":"2026-03-31T20:41:29.046320+0000","last_peered":"2026-03-31T20:41:29.046320+0000","last_clean":"2026-03-31T20:41:29.046320+0000","last_became_active":"2026-03-31T20:37:10.920712+0000","last_became_peered":"2026-03-31T20:37:10.920712+0000","last_unstale":"2026-03-31T20:41:29.046320+0000","last_undegraded":"2026-03-31T20:41:29.046320+0000","last_fullsized":"2026-03-31T20:41:29.046320+0000","mapping_epoch":710,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":711,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:39.414769+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:33.464380+0000","last_clean_scrub_stamp":"2026-03-31T20:26:39.414769+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:48:38.702799+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.5","version":"0'0","reported_seq":1162,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046003+0000","last_change":"2026-03-31T20:37:09.919126+0000","last_active":"2026-03-31T20:41:29.046003+0000","last_peered":"2026-03-31T20:41:29.046003+0000","last_clean":"2026-03-31T20:41:29.046003+0000","last_became_active":"2026-03-31T20:37:09.918910+0000","last_became_peered":"2026-03-31T20:37:09.918910+0000","last_unstale":"2026-03-31T20:41:29.046003+0000","last_undegraded":"2026-03-31T20:41:29.046003+0000","last_fullsized":"2026-03-31T20:41:29.046003+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:38.458279+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:32.494594+0000","last_clean_scrub_stamp":"2026-03-31T20:26:38.458279+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:33:50.695920+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.6","version":"0'0","reported_seq":1161,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046113+0000","last_change":"2026-03-31T20:37:09.918790+0000","last_active":"2026-03-31T20:41:29.046113+0000","last_peered":"2026-03-31T20:41:29.046113+0000","last_clean":"2026-03-31T20:41:29.046113+0000","last_became_active":"2026-03-31T20:37:09.918690+0000","last_became_peered":"2026-03-31T20:37:09.918690+0000","last_unstale":"2026-03-31T20:41:29.046113+0000","last_undegraded":"2026-03-31T20:41:29.046113+0000","last_fullsized":"2026-03-31T20:41:29.046113+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:41.404628+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:35.469461+0000","last_clean_scrub_stamp":"2026-03-31T20:26:41.404628+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T00:39:39.971833+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.7","version":"0'0","reported_seq":1165,"reported_epoch":809,"state":"active+clean","last_fresh":"2026-03-31T20:41:29.046047+0000","last_change":"2026-03-31T20:37:07.899092+0000","last_active":"2026-03-31T20:41:29.046047+0000","last_peered":"2026-03-31T20:41:29.046047+0000","last_clean":"2026-03-31T20:41:29.046047+0000","last_became_active":"2026-03-31T20:37:07.898875+0000","last_became_peered":"2026-03-31T20:37:07.898875+0000","last_unstale":"2026-03-31T20:41:29.046047+0000","last_undegraded":"2026-03-31T20:41:29.046047+0000","last_fullsized":"2026-03-31T20:41:29.046047+0000","mapping_epoch":707,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":708,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:26:40.426768+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:26:34.453503+0000","last_clean_scrub_stamp":"2026-03-31T20:26:40.426768+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:27:47.105587+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":3,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8405,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":704512,"data_stored":1377840,"data_compressed":14356,"data_compressed_allocated":700416,"data_compressed_original":1373744,"omap_allocated":6380,"internal_metadata":0},"log_size":41,"
ondisk_log_size":41,"up":2,"acting":2,"num_store_stats":3}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738630,"num_pgs":364,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31468,"kb_used_data":4636,"kb_used_omap":526,"kb_used_meta":26289,"kb_avail":94340372,"statfs":{"total":96636764160,"available":96604540928,"internally_reserved":0,"allocated":4747264,"data_stored":2437324,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":538982,"internal_metadata":26920602},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738630,"num_pgs":93,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31512,"kb_used_data":4424,"kb_used_omap":61,"kb_used_meta":27010,"kb_avail":94340328,"statfs":{"total":96636764160,"available":96604495872,"internally_reserved":0,"allocated":4530176,"data_stored":1982827,"data_compressed":44615,"data_compressed_allocated":139264,"data_compressed_original":189202,"omap_allocated":63487,"internal_metadata":27658241},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":1,"apply_latency_ms":1,"commit_latency_ns":1000000,"apply_latency_ns":1000000},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772678,"num_pgs":357,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31472,"kb_used_data":4640,"kb_used_omap":411,"kb_used_meta":26405,"kb_avail":94340368,"statfs":{"total":96636764160,"available":96604536832,"internally_reserved":0,"allocated":4751360,"data_stored":2437343,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":420864,"internal_metadata":27038720},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":2,"apply_latency_ms":2,"commit_latency_ns":2000000,"apply_latency_ns":2000000},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":6380,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3190,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2503,"internal_metadata":0},{"poolid":2,"o
sd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2712,"internal_metadata":0}]}} 2026-03-31T20:41:34.994 INFO:tasks.ceph:pgid 1.0 last_scrub_stamp 2026-03-31T20:31:22.320510+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=31, tm_sec=22, tm_wday=1, tm_yday=90, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=41, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=0) 2026-03-31T20:41:34.994 INFO:tasks.ceph:pgid 2.3 last_scrub_stamp 2026-03-31T20:26:37.430905+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=26, tm_sec=37, tm_wday=1, tm_yday=90, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=41, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=0) 2026-03-31T20:41:34.994 INFO:tasks.ceph:pgid 2.0 last_scrub_stamp 2026-03-31T20:26:33.059473+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=26, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=41, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=0) 2026-03-31T20:41:34.994 INFO:tasks.ceph:pgid 2.1 last_scrub_stamp 2026-03-31T20:26:34.047600+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=26, tm_sec=34, tm_wday=1, tm_yday=90, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=41, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=0) 2026-03-31T20:41:34.994 INFO:tasks.ceph:pgid 2.2 last_scrub_stamp 2026-03-31T20:26:31.889790+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=26, tm_sec=31, tm_wday=1, tm_yday=90, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=41, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=0) 2026-03-31T20:41:34.994 INFO:tasks.ceph:pgid 2.4 last_scrub_stamp 2026-03-31T20:26:39.414769+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=26, tm_sec=39, tm_wday=1, tm_yday=90, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=41, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=0) 2026-03-31T20:41:34.994 INFO:tasks.ceph:pgid 2.5 last_scrub_stamp 2026-03-31T20:26:38.458279+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=26, tm_sec=38, tm_wday=1, tm_yday=90, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=41, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=0) 2026-03-31T20:41:34.994 INFO:tasks.ceph:pgid 2.6 last_scrub_stamp 2026-03-31T20:26:41.404628+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=26, tm_sec=41, tm_wday=1, tm_yday=90, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=41, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=0) 2026-03-31T20:41:34.994 INFO:tasks.ceph:pgid 2.7 last_scrub_stamp 2026-03-31T20:26:40.426768+0000 time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=26, tm_sec=40, tm_wday=1, tm_yday=90, tm_isdst=-1) <= time.struct_time(tm_year=2026, tm_mon=3, tm_mday=31, tm_hour=20, tm_min=41, tm_sec=33, tm_wday=1, tm_yday=90, tm_isdst=0) 2026-03-31T20:41:34.994 INFO:tasks.ceph:Still waiting for all pgs to be scrubbed. 
2026-03-31T20:41:54.995 DEBUG:teuthology.orchestra.run.vm03:> sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 120 ceph --cluster ceph pg dump --format=json 2026-03-31T20:41:55.145 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-31T20:41:55.145 INFO:teuthology.orchestra.run.vm03.stderr:dumped all 2026-03-31T20:41:55.157 INFO:teuthology.orchestra.run.vm03.stdout:{"pg_ready":true,"pg_map":{"version":1162,"stamp":"2026-03-31T20:41:53.742598+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459299,"num_objects":4,"num_object_clones":0,"num_object_copies":8,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":4,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":71,"num_write_kb":586,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":11,"num_bytes_recovered":1837158,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":46,"ondisk_log_size":46,"up":18,"acting":18,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":18,"num_osds":3,"num_per_pool_osds":3,"num_per_pool_omap_osds":3,"kb":283115520,"kb_used":94512,"kb_used_data":13760,"kb_used_omap":1003,"kb_used_meta":79700,"kb_avail":283021008,"statfs":{"total":289910292480,"available":289813512192,"internally_reserved":0,"allocated":14090240,"data_stored":6873802,"data_compressed":143397,"data_compressed_allocated":884736,"data_compressed_original":1486166,"omap_allocated":1027325,"internal_metadata":81613571},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,
"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"12.001368"},"pg_stats":[{"pgid":"1.0","version":"172'41","reported_seq":1303,"reported_epoch":811,"state":"active+clean","last_fresh":"2026-03-31T20:41:35.297248+0000","last_change":"2026-03-31T20:41:35.297248+0000","last_active":"2026-03-31T20:41:35.297248+0000","last_peered":"2026-03-31T20:41:35.297248+0000","last_clean":"2026-03-31T20:41:35.297248+0000","last_became_active":"2026-03-31T20:37:09.919785+0000","last_became_peered":"2026-03-31T20:37:09.919785+0000","last_unstale":"2026-03-31T20:41:35.297248+0000","last_undegraded":"2026-03-31T20:41:35.297248+0000","last_fullsized":"2026-03-31T20:41:35.297248+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":9,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'41","last_scrub_stamp":"2026-03-31T20:41:35.297213+0000","last_deep_scrub":"172'41","last_deep_scrub_stamp":"2026-03-31T20:41:35.297213+0000","last_clean_scrub_stamp":"2026-03-31T20:41:35.297213+0000","objects_scrubbed":2,"log_size":41,"log_dups_size":0,"ondisk_log_size":41,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-04-01T23:28:05.260875+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.3","version":"172'2","reported_seq":568,"reported_epoch":811,"state":"active+clean","last_fresh":"2026-03-31T20:41:37.149539+0000","last_change":"2026-03-31T20:41:37.149539+0000","last_active":"2026-03-31T20:41:37.149539+0000","last_peered":"2026-03-31T20:41:37.149539+0000","last_clean":"2026-03-31T20:41:37.149539+0000","last_became_active":"2026-03-31T20:37:09.919661+0000","last_became_peered":"2026-03-31T20:37:09.919661+0000","last_unstale":"2026-03-31T20:41:37.149539+0000","last_undegraded":"2026-03-31T20:41:37.149539+0000","last_fullsized":"2026-03-31T20:41:37.149539+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'2","last_scrub_stamp":"2026-03-31T20:41:37.149497+0000","last_deep_scrub":"172'2","last_deep_scrub_stamp":"2026-03-31T20:41:37.149497+0000","last_clean_scrub_stamp":"2026-
03-31T20:41:37.149497+0000","objects_scrubbed":1,"log_size":2,"log_dups_size":0,"ondisk_log_size":2,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 2026-04-02T06:37:36.782025+0000","scrub_duration":4,"objects_trimmed":0,"snaptrim_duration":0.00057155900000000002,"stat_sum":{"num_bytes":0,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":1,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.0","version":"0'0","reported_seq":1664,"reported_epoch":811,"state":"active+clean","last_fresh":"2026-03-31T20:41:36.167247+0000","last_change":"2026-03-31T20:41:36.167247+0000","last_active":"2026-03-31T20:41:36.167247+0000","last_peered":"2026-03-31T20:41:36.167247+0000","last_clean":"2026-03-31T20:41:36.167247+0000","last_became_active":"2026-03-31T20:37:09.919658+0000","last_became_peered":"2026-03-31T20:37:09.919658+0000","last_unstale":"2026-03-31T20:41:36.167247+0000","last_undegraded":"2026-03-31T20:41:36.167247+0000","last_fullsized":"2026-03-31T20:41:36.167247+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:41:36.167210+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:41:36.167210+0000","last_clean_scrub_stamp":"2026-03-31T20:41:36.167210+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T05:28:31.151406+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00028135699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,0],"acting":[2,0],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.1","version":"0'0","reported_seq":1657,"reported_epoch":811,"state":"active+clean","last_fresh":"2026-03-31T20:41:35.213668+0000","last_change":"2026-03-31T20:41:35.213668+0000","last_active":"2026-03-31T20:41:35.213668+0000","last_peered":"2026-03-31T20:41:35.213668+0000","last_clean":"2026-03-31T20:41:35.213668+0000","last_became_active":"2026-03-31T20:21:33.036761+0000","last_became_peered":"2026-03-31T20:21:33.036761+0000","last_unstale":"2026-03-31T20:41:35.213668+0000","last_undegraded":"2026-03-31T20:41:35.213668+0000","last_fullsized":"2026-03-31T20:41:35.213668+0000","mapping_epoch":12,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":13,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:41:35.213628+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:41:35.213628+0000","last_clean_scrub_stamp":"2026-03-31T20:41:35.213628+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":1,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T07:34:44.436608+0000","scrub_duration":4,"objects_trimmed":0,"snaptrim_duration":0.00033548699999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[2,1],"acting":[2,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":2,"acting_primary":2,"purged_snaps":[]},{"pgid":"2.2","version":"172'3","reported_seq":1582,"reported_epoch":811,"state":"active+clean","last_fresh":"2026-03-31T20:41:36.281117+0000","last_change":"2026-03-31T20:41:36.281117+0000","last_active":"2026-03-31T20:41:36.281117+0000","last_peered":"2026-03-31T20:41:36.281117+0000","last_clean":"2026-03-31T20:41:36.281117+0000","last_became_active":"2026-03-31T20:29:18.927209+0000","last_became_peered":"2026-03-31T20:29:18.927209+0000","last_unstale":"2026-03-31T20:41:36.281117+0000","last_undegraded":"2026-03-31T20:41:36.281117+0000","last_fullsized":"2026-03-31T20:41:36.281117+0000","mapping_epoch":330,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":331,"parent":"0.0","parent_split_bits":0,"last_scrub":"172'3","last_scrub_stamp":"2026-03-31T20:41:36.281073+0000","last_deep_scrub":"172'3","last_deep_scrub_stamp":"2026-03-31T20:41:36.281073+0000","last_clean_scrub_stamp":"2026-03-31T20:41:36.281073+0000","objects_scrubbed":1,"log_size":3,"log_dups_size":0,"ondisk_log_size":3,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T23:49:37.919826+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00026154900000000003,"stat_sum":{"num_bytes":19,"num_objects":1,"num_object_clones":0,"num_object_copies":2,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":1,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":2,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,1],"acting":[0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.4","version":"0'0","reported_seq":915,"reported_epoch":811,"state":"active+clean","last_fresh":"2026-03-31T20:41:37.235761+0000","last_change":"2026-03-31T20:41:37.235761+0000","last_active":"2026-03-31T20:41:37.235761+0000","last_peered":"2026-03-31T20:41:37.235761+0000","last_clean":"2026-03-31T20:41:37.235761+0000","last_became_active":"2026-03-31T20:37:10.920712+0000","last_became_peered":"2026-03-31T20:37:10.920712+0000","last_unstale":"2026-03-31T20:41:37.235761+0000","last_undegraded":"2026-03-31T20:41:37.235761+0000","last_fullsized":"2026-03-31T20:41:37.235761+0000","mapping_epoch":710,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":711,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:41:37.235722+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:41:37.235722+0000","last_clean_scrub_stamp":"2026-03-31T20:41:37.235722+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T01:21:12.117544+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00049229099999999995,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.5","version":"0'0","reported_seq":1174,"reported_epoch":811,"state":"active+clean","last_fresh":"2026-03-31T20:41:38.219537+0000","last_change":"2026-03-31T20:41:38.219537+0000","last_active":"2026-03-31T20:41:38.219537+0000","last_peered":"2026-03-31T20:41:38.219537+0000","last_clean":"2026-03-31T20:41:38.219537+0000","last_became_active":"2026-03-31T20:37:09.918910+0000","last_became_peered":"2026-03-31T20:37:09.918910+0000","last_unstale":"2026-03-31T20:41:38.219537+0000","last_undegraded":"2026-03-31T20:41:38.219537+0000","last_fullsized":"2026-03-31T20:41:38.219537+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:41:38.219502+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:41:38.219502+0000","last_clean_scrub_stamp":"2026-03-31T20:41:38.219502+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T07:14:38.124216+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00042708900000000002,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.6","version":"0'0","reported_seq":1173,"reported_epoch":811,"state":"active+clean","last_fresh":"2026-03-31T20:41:39.214593+0000","last_change":"2026-03-31T20:41:39.214593+0000","last_active":"2026-03-31T20:41:39.214593+0000","last_peered":"2026-03-31T20:41:39.214593+0000","last_clean":"2026-03-31T20:41:39.214593+0000","last_became_active":"2026-03-31T20:37:09.918690+0000","last_became_peered":"2026-03-31T20:37:09.918690+0000","last_unstale":"2026-03-31T20:41:39.214593+0000","last_undegraded":"2026-03-31T20:41:39.214593+0000","last_fullsized":"2026-03-31T20:41:39.214593+0000","mapping_epoch":709,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":710,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:41:39.214553+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:41:39.214553+0000","last_clean_scrub_stamp":"2026-03-31T20:41:39.214553+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-02T04:12:57.455199+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00038476,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]},{"pgid":"2.7","version":"0'0","reported_seq":1177,"reported_epoch":811,"state":"active+clean","last_fresh":"2026-03-31T20:41:40.194593+0000","last_change":"2026-03-31T20:41:40.194593+0000","last_active":"2026-03-31T20:41:40.194593+0000","last_peered":"2026-03-31T20:41:40.194593+0000","last_clean":"2026-03-31T20:41:40.194593+0000","last_became_active":"2026-03-31T20:37:07.898875+0000","last_became_peered":"2026-03-31T20:37:07.898875+0000","last_unstale":"2026-03-31T20:41:40.194593+0000","last_undegraded":"2026-03-31T20:41:40.194593+0000","last_fullsized":"2026-03-31T20:41:40.194593+0000","mapping_epoch":707,"log_start":"0'0","ondisk_log_start":"0'0","created":12,"last_epoch_clean":708,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-31T20:41:40.194542+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-31T20:41:40.194542+0000","last_clean_scrub_stamp":"2026-03-31T20:41:40.194542+0000","objects_scrubbed":0,"log_size":0,"log_dups_size":0,"ondisk_log_size":0,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 
2026-04-01T22:06:36.803452+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0.00037397899999999998,"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[0,2],"acting":[0,2],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":0,"acting_primary":0,"purged_snaps":[]}],"pool_stats":[{"poolid":2,"num_pg":8,"stat_sum":{"num_bytes":19,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":2,"num_write_kb":2,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":3,"num_bytes_recovered":38,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":8192,"data_stored":38,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":8573,"internal_metadata":0},"log_size":5,"ondisk_log_size":5,"up":16,"acting":16,"num_store_stats":3},{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":4,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":106,"num_read_kb":213,"num_write":69,"num_write_kb":584,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":8,"num_bytes_recovered":1837120,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":704512,"data_stored":1377840,"data_compressed":14356,"data_compressed_allocated":700416,"data_compressed_original":1373744,"omap_allocated":6380,"internal_metadata":0},"log_size":41,"
ondisk_log_size":41,"up":2,"acting":2,"num_store_stats":3}],"osd_stats":[{"osd":2,"up_from":8,"seq":34359738634,"num_pgs":8,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31492,"kb_used_data":4660,"kb_used_omap":527,"kb_used_meta":26288,"kb_avail":94340348,"statfs":{"total":96636764160,"available":96604516352,"internally_reserved":0,"allocated":4771840,"data_stored":2444348,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":540366,"internal_metadata":26919218},"hb_peers":[0,1],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":1,"up_from":8,"seq":34359738634,"num_pgs":2,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31524,"kb_used_data":4436,"kb_used_omap":64,"kb_used_meta":27007,"kb_avail":94340316,"statfs":{"total":96636764160,"available":96604483584,"internally_reserved":0,"allocated":4542464,"data_stored":1985087,"data_compressed":44615,"data_compressed_allocated":139264,"data_compressed_original":189202,"omap_allocated":66095,"internal_metadata":27655633},"hb_peers":[0,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]},{"osd":0,"up_from":221,"seq":949187772682,"num_pgs":8,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":94371840,"kb_used":31496,"kb_used_data":4664,"kb_used_omap":411,"kb_used_meta":26405,"kb_avail":94340344,"statfs":{"total":96636764160,"available":96604512256,"internally_reserved":0,"allocated":4775936,"data_stored":2444367,"data_compressed":49391,"data_compressed_allocated":372736,"data_compressed_original":648482,"omap_allocated":420864,"internal_metadata":27038720},"hb_peers":[1,2],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":237568,"data_stored":459280,"data_compressed":4804,"data_compressed_allocated":233472,"data_compressed_original":455184,"omap_allocated":6380,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":233472,"data_stored":459280,"data_compressed":4776,"data_compressed_allocated":233472,"data_compressed_original":459280,"omap_allocated":0,"internal_metadata":0},{"poolid":2,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":3190,"internal_metadata":0},{"poolid":2,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":4096,"data_stored":19,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2503,"internal_metadata":0},{"poolid":2,"osd":2,"total":0,"available":0,"internally
_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":2880,"internal_metadata":0}]}} 2026-03-31T20:41:55.157 DEBUG:teuthology.orchestra.run.vm03:> sudo ceph --cluster ceph config set global mon_health_to_clog false 2026-03-31T20:41:55.325 INFO:teuthology.misc:Shutting down mds daemons... 2026-03-31T20:41:55.325 INFO:teuthology.misc:Shutting down osd daemons... 2026-03-31T20:41:55.325 DEBUG:tasks.ceph.osd.0:waiting for process to exit 2026-03-31T20:41:55.325 INFO:teuthology.orchestra.run:waiting for 300 2026-03-31T20:41:55.374 INFO:tasks.ceph.osd.0:Stopped 2026-03-31T20:41:55.374 DEBUG:tasks.ceph.osd.1:waiting for process to exit 2026-03-31T20:41:55.375 INFO:teuthology.orchestra.run:waiting for 300 2026-03-31T20:41:55.421 INFO:tasks.ceph.osd.1:Stopped 2026-03-31T20:41:55.421 DEBUG:tasks.ceph.osd.2:waiting for process to exit 2026-03-31T20:41:55.421 INFO:teuthology.orchestra.run:waiting for 300 2026-03-31T20:41:55.485 INFO:tasks.ceph.osd.2:Stopped 2026-03-31T20:41:55.486 INFO:teuthology.misc:Shutting down mgr daemons... 2026-03-31T20:41:55.486 DEBUG:tasks.ceph.mgr.x:waiting for process to exit 2026-03-31T20:41:55.486 INFO:teuthology.orchestra.run:waiting for 300 2026-03-31T20:41:55.511 INFO:tasks.ceph.mgr.x:Stopped 2026-03-31T20:41:55.511 INFO:teuthology.misc:Shutting down mon daemons... 2026-03-31T20:41:55.511 DEBUG:tasks.ceph.mon.a:waiting for process to exit 2026-03-31T20:41:55.511 INFO:teuthology.orchestra.run:waiting for 300 2026-03-31T20:41:55.563 INFO:tasks.ceph.mon.a:Stopped 2026-03-31T20:41:55.563 DEBUG:tasks.ceph.mon.b:waiting for process to exit 2026-03-31T20:41:55.563 INFO:teuthology.orchestra.run:waiting for 300 2026-03-31T20:41:55.613 INFO:tasks.ceph.mon.b:Stopped 2026-03-31T20:41:55.613 DEBUG:tasks.ceph.mon.c:waiting for process to exit 2026-03-31T20:41:55.613 INFO:teuthology.orchestra.run:waiting for 300 2026-03-31T20:41:55.665 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.osd.2 is failed for ~0s 2026-03-31T20:41:55.665 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.mon.a is failed for ~0s 2026-03-31T20:41:55.665 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.mon.b is failed for ~0s 2026-03-31T20:41:55.665 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.mon.c is failed for ~0s 2026-03-31T20:41:55.665 INFO:tasks.ceph.mon.c:Stopped 2026-03-31T20:41:55.665 INFO:tasks.ceph:Checking cluster log for badness... 
2026-03-31T20:41:55.665 DEBUG:teuthology.orchestra.run.vm03:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/ceph.log | egrep -v 'but it is still running' | egrep -v 'had wrong client addr' | egrep -v 'had wrong cluster addr' | egrep -v 'must scrub before tier agent can activate' | egrep -v 'failsafe engaged, dropping updates' | egrep -v 'failsafe disengaged, no longer dropping updates' | egrep -v 'overall HEALTH_' | egrep -v '\(OSDMAP_FLAGS\)' | egrep -v '\(OSD_' | egrep -v '\(PG_' | egrep -v '\(SMALLER_PG_NUM\)' | egrep -v '\(SMALLER_PGP_NUM\)' | egrep -v '\(CACHE_POOL_NO_HIT_SET\)' | egrep -v '\(CACHE_POOL_NEAR_FULL\)' | egrep -v '\(FS_WITH_FAILED_MDS\)' | egrep -v '\(FS_DEGRADED\)' | egrep -v '\(POOL_BACKFILLFULL\)' | egrep -v '\(POOL_FULL\)' | egrep -v '\(SMALLER_PGP_NUM\)' | egrep -v '\(POOL_NEARFULL\)' | egrep -v '\(POOL_APP_NOT_ENABLED\)' | egrep -v '\(AUTH_BAD_CAPS\)' | egrep -v '\(FS_INLINE_DATA_DEPRECATED\)' | egrep -v '\(MON_DOWN\)' | egrep -v '\(SLOW_OPS\)' | egrep -v 'slow request' | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | head -n 1 2026-03-31T20:41:55.720 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-0 on ubuntu@vm03.local 2026-03-31T20:41:55.720 DEBUG:teuthology.orchestra.run.vm03:> sync && sudo umount -f /var/lib/ceph/osd/ceph-0 2026-03-31T20:41:55.780 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-1 on ubuntu@vm03.local 2026-03-31T20:41:55.780 DEBUG:teuthology.orchestra.run.vm03:> sync && sudo umount -f /var/lib/ceph/osd/ceph-1 2026-03-31T20:41:55.836 INFO:tasks.ceph:Unmounting /var/lib/ceph/osd/ceph-2 on ubuntu@vm03.local 2026-03-31T20:41:55.836 DEBUG:teuthology.orchestra.run.vm03:> sync && sudo umount -f /var/lib/ceph/osd/ceph-2 2026-03-31T20:41:55.891 INFO:tasks.ceph:Archiving mon data... 2026-03-31T20:41:55.891 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/lib/ceph/mon/ceph-a to /archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4344/data/mon.a.tgz 2026-03-31T20:41:55.891 DEBUG:teuthology.orchestra.run.vm03:> mktemp 2026-03-31T20:41:55.934 INFO:teuthology.orchestra.run.vm03.stdout:/tmp/tmp.ZjVEtHErVd 2026-03-31T20:41:55.934 DEBUG:teuthology.orchestra.run.vm03:> sudo tar cz -f - -C /var/lib/ceph/mon/ceph-a -- . > /tmp/tmp.ZjVEtHErVd 2026-03-31T20:41:56.060 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 0666 /tmp/tmp.ZjVEtHErVd 2026-03-31T20:41:56.111 DEBUG:teuthology.orchestra.remote:vm03:/tmp/tmp.ZjVEtHErVd is 687KB 2026-03-31T20:41:56.160 DEBUG:teuthology.orchestra.run.vm03:> rm -fr /tmp/tmp.ZjVEtHErVd 2026-03-31T20:41:56.162 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/lib/ceph/mon/ceph-b to /archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4344/data/mon.b.tgz 2026-03-31T20:41:56.162 DEBUG:teuthology.orchestra.run.vm03:> mktemp 2026-03-31T20:41:56.210 INFO:teuthology.orchestra.run.vm03.stdout:/tmp/tmp.Ma1qwWrgLa 2026-03-31T20:41:56.210 DEBUG:teuthology.orchestra.run.vm03:> sudo tar cz -f - -C /var/lib/ceph/mon/ceph-b -- . 
> /tmp/tmp.Ma1qwWrgLa 2026-03-31T20:41:56.336 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 0666 /tmp/tmp.Ma1qwWrgLa 2026-03-31T20:41:56.387 DEBUG:teuthology.orchestra.remote:vm03:/tmp/tmp.Ma1qwWrgLa is 704KB 2026-03-31T20:41:56.436 DEBUG:teuthology.orchestra.run.vm03:> rm -fr /tmp/tmp.Ma1qwWrgLa 2026-03-31T20:41:56.439 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/lib/ceph/mon/ceph-c to /archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4344/data/mon.c.tgz 2026-03-31T20:41:56.439 DEBUG:teuthology.orchestra.run.vm03:> mktemp 2026-03-31T20:41:56.482 INFO:teuthology.orchestra.run.vm03.stdout:/tmp/tmp.qpep11BjLo 2026-03-31T20:41:56.482 DEBUG:teuthology.orchestra.run.vm03:> sudo tar cz -f - -C /var/lib/ceph/mon/ceph-c -- . > /tmp/tmp.qpep11BjLo 2026-03-31T20:41:56.608 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod 0666 /tmp/tmp.qpep11BjLo 2026-03-31T20:41:56.659 DEBUG:teuthology.orchestra.remote:vm03:/tmp/tmp.qpep11BjLo is 704KB 2026-03-31T20:41:56.708 DEBUG:teuthology.orchestra.run.vm03:> rm -fr /tmp/tmp.qpep11BjLo 2026-03-31T20:41:56.710 INFO:tasks.ceph:Cleaning ceph cluster... 2026-03-31T20:41:56.710 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /etc/ceph/ceph.conf /etc/ceph/ceph.keyring /home/ubuntu/cephtest/ceph.data /home/ubuntu/cephtest/ceph.monmap /home/ubuntu/cephtest/../*.pid 2026-03-31T20:41:56.806 INFO:teuthology.util.scanner:summary_data or yaml_file is empty! 2026-03-31T20:41:56.806 INFO:tasks.ceph:Archiving crash dumps... 2026-03-31T20:41:56.806 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/lib/ceph/crash to /archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4344/remote/vm03/crash 2026-03-31T20:41:56.806 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /var/lib/ceph/crash -- . 2026-03-31T20:41:56.853 INFO:tasks.ceph:Compressing logs... 
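[editor's note] The "Checking cluster log for badness" step above is a single grep: pull every [ERR], [WRN] or [SEC] line out of ceph.log, discard anything matched by an ignorelist entry, and keep only the first survivor (the head -n 1); an empty result lets the run continue toward archiving. A minimal Python equivalent, with only a handful of the patterns from the egrep -v chain spelled out; first_bad_line is a hypothetical name, not teuthology code:

import re

BADNESS = re.compile(r'\[ERR\]|\[WRN\]|\[SEC\]')
# a sample of the ignorelist patterns from the egrep -v chain above;
# the real check applies the job's full log-ignorelist
IGNORELIST = [re.compile(p) for p in (
    r'but it is still running',
    r'overall HEALTH_',
    r'\(OSDMAP_FLAGS\)',
    r'\(POOL_APP_NOT_ENABLED\)',
    r'\(MON_DOWN\)',
    r'slow request',
)]

def first_bad_line(path='/var/log/ceph/ceph.log'):
    # Return the first error/warning line that no ignorelist entry excuses,
    # or None when the cluster log is clean.
    with open(path) as f:
        for line in f:
            if BADNESS.search(line) and not any(p.search(line) for p in IGNORELIST):
                return line.rstrip()
    return None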
2026-03-31T20:41:56.853 DEBUG:teuthology.orchestra.run.vm03:> time sudo find /var/log/ceph -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-31T20:41:56.901-20:41:56.946 INFO:teuthology.orchestra.run.vm03.stderr: [gzip --verbose output from the parallel xargs workers, condensed] each *.log under /var/log/ceph was compressed in place and replaced with *.log.gz: dozens of short ceph-client.admin.<pid>.log files, plus ceph-client.admin2.32700.log, ceph-client.xx-profile-rw.33314.log and ceph-client.xx-profile-rd.{33460,33484,33532}.log, all reporting 0.0% savings; ceph.audit.log compressed 96.2%; the daemon logs, beginning with ceph-mon.b.log, were still compressing at the end of this burst. 2026-03-31T20:41:56.946
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36635.log 2026-03-31T20:41:56.946 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60325.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60325.log.gz 2026-03-31T20:41:56.946 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48655.log 2026-03-31T20:41:56.946 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44783.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44783.log.gz 2026-03-31T20:41:56.946 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36635.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62444.log 2026-03-31T20:41:56.946 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36635.log.gz 2026-03-31T20:41:56.946 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59487.log 2026-03-31T20:41:56.946 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48655.log.gz 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39799.log 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62444.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62444.log.gz 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59487.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62576.log 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59487.log.gz 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37498.log 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39799.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39799.log.gz 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26962.log 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62576.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62576.log.gz 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37498.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37498.log.gz 2026-03-31T20:41:56.947 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26507.log 2026-03-31T20:41:56.948 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54735.log 2026-03-31T20:41:56.948 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26962.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26962.log.gz 2026-03-31T20:41:56.948 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57984.log 2026-03-31T20:41:56.948 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26507.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26507.log.gz 2026-03-31T20:41:56.948 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54735.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59366.log 2026-03-31T20:41:56.948 
INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54735.log.gz 2026-03-31T20:41:56.948 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58500.log 2026-03-31T20:41:56.948 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57984.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57984.log.gz 2026-03-31T20:41:56.949 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42810.log 2026-03-31T20:41:56.949 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59366.log.gz 2026-03-31T20:41:56.949 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58500.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31690.log 2026-03-31T20:41:56.949 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58500.log.gz 2026-03-31T20:41:56.949 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42810.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42810.log.gz 2026-03-31T20:41:56.949 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45650.log 2026-03-31T20:41:56.949 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52712.log 2026-03-31T20:41:56.950 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31690.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31690.log.gz 2026-03-31T20:41:56.950 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60447.log 2026-03-31T20:41:56.950 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45650.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45650.log.gz 2026-03-31T20:41:56.950 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52712.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46911.log 2026-03-31T20:41:56.950 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52712.log.gz 2026-03-31T20:41:56.950 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60447.log.gz 2026-03-31T20:41:56.950 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52410.log 2026-03-31T20:41:56.951 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53035.log 2026-03-31T20:41:56.951 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46911.log.gz 2026-03-31T20:41:56.951 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53734.log 2026-03-31T20:41:56.951 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52410.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52410.log.gz 2026-03-31T20:41:56.951 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53035.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44123.log 2026-03-31T20:41:56.951 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53035.log.gz 2026-03-31T20:41:56.951 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.36030.log 2026-03-31T20:41:56.951 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53734.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53734.log.gz 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65356.log 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44123.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44123.log.gz 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36030.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28853.log 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36030.log.gz 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32932.log 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65356.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65356.log.gz 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42886.log 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28853.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28853.log.gz 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32932.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43388.log 2026-03-31T20:41:56.952 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32932.log.gz 2026-03-31T20:41:56.953 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-ro.33149.log 2026-03-31T20:41:56.953 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42886.log.gz 2026-03-31T20:41:56.953 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54910.log 2026-03-31T20:41:56.953 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43388.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43388.log.gz 2026-03-31T20:41:56.953 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-ro.33149.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61322.log 2026-03-31T20:41:56.953 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-ro.33149.log.gz 2026-03-31T20:41:56.953 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38690.log 2026-03-31T20:41:56.953 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54910.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54910.log.gz 2026-03-31T20:41:56.953 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43167.log 2026-03-31T20:41:56.954 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61322.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61322.log.gz 2026-03-31T20:41:56.954 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38690.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47486.log 2026-03-31T20:41:56.954 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced 
with /var/log/ceph/ceph-client.admin.38690.log.gz 2026-03-31T20:41:56.954 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29617.log 2026-03-31T20:41:56.954 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43167.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43167.log.gz 2026-03-31T20:41:56.954 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45155.log 2026-03-31T20:41:56.954 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47486.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47486.log.gz 2026-03-31T20:41:56.954 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29617.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69899.log 2026-03-31T20:41:56.954 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29617.log.gz 2026-03-31T20:41:56.955 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42707.log 2026-03-31T20:41:56.955 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45155.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45155.log.gz 2026-03-31T20:41:56.955 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62070.log 2026-03-31T20:41:56.955 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69899.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69899.log.gz 2026-03-31T20:41:56.955 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42707.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37354.log 2026-03-31T20:41:56.955 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42707.log.gz 2026-03-31T20:41:56.955 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62070.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62070.log.gz 2026-03-31T20:41:56.956 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60374.log 2026-03-31T20:41:56.956 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37354.log.gz 2026-03-31T20:41:56.956 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65470.log 2026-03-31T20:41:56.956 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60374.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.60374.log.gz --verbose 2026-03-31T20:41:56.956 INFO:teuthology.orchestra.run.vm03.stderr: -- /var/log/ceph/ceph-client.admin.39675.log 2026-03-31T20:41:56.956 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65470.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52875.log 2026-03-31T20:41:56.956 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65470.log.gz 2026-03-31T20:41:56.956 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26384.log 2026-03-31T20:41:56.957 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39675.log.gz 2026-03-31T20:41:56.957 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52875.log: 0.0%gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.54373.log 2026-03-31T20:41:56.957 INFO:teuthology.orchestra.run.vm03.stderr: -- replaced with /var/log/ceph/ceph-client.admin.52875.log.gz 2026-03-31T20:41:56.957 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26384.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62730.log 2026-03-31T20:41:56.957 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26384.log.gz 2026-03-31T20:41:56.957 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46819.log 2026-03-31T20:41:56.957 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54373.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54373.log.gz 2026-03-31T20:41:56.957 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62730.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43215.log 2026-03-31T20:41:56.957 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62730.log.gz 2026-03-31T20:41:56.958 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46819.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30611.log 2026-03-31T20:41:56.958 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46819.log.gz 2026-03-31T20:41:56.958 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27070.log 2026-03-31T20:41:56.958 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43215.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43215.log.gz 2026-03-31T20:41:56.958 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30611.log: gzip -5 --verbose 0.0% -- /var/log/ceph/ceph-client.admin.38086.log -- replaced with /var/log/ceph/ceph-client.admin.30611.log.gz 2026-03-31T20:41:56.958 INFO:teuthology.orchestra.run.vm03.stderr: 2026-03-31T20:41:56.958 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27070.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46679.log 2026-03-31T20:41:56.958 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27070.log.gz 2026-03-31T20:41:56.958 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61673.log 2026-03-31T20:41:56.959 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38086.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38086.log.gz 2026-03-31T20:41:56.959 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46679.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36680.log 2026-03-31T20:41:56.959 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46679.log.gz 2026-03-31T20:41:56.959 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61673.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37996.log 2026-03-31T20:41:56.959 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61673.log.gz 2026-03-31T20:41:56.959 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48148.log 2026-03-31T20:41:56.959 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36680.log: /var/log/ceph/ceph-client.admin.37996.log: 0.0% -- replaced with 
/var/log/ceph/ceph-client.admin.36680.log.gz 2026-03-31T20:41:56.959 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37996.log.gz 2026-03-31T20:41:56.959 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35265.log 2026-03-31T20:41:56.960 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48148.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67968.log 2026-03-31T20:41:56.960 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48148.log.gz 2026-03-31T20:41:56.960 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35265.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27319.log 2026-03-31T20:41:56.960 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35265.log.gz 2026-03-31T20:41:56.960 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42913.log 2026-03-31T20:41:56.960 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67968.log.gz 2026-03-31T20:41:56.960 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32232.log 2026-03-31T20:41:56.960 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27319.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27319.log.gz 2026-03-31T20:41:56.961 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42913.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42913.log.gz 2026-03-31T20:41:56.961 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60278.log 2026-03-31T20:41:56.961 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38107.log 2026-03-31T20:41:56.961 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32232.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32232.log.gz 2026-03-31T20:41:56.961 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66233.log 2026-03-31T20:41:56.961 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60278.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60278.log.gz 2026-03-31T20:41:56.961 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38107.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40921.log 2026-03-31T20:41:56.961 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38107.log.gz 2026-03-31T20:41:56.961 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31387.log 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66233.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66233.log.gz 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47680.log 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40921.log.gz 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31387.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69708.log 
2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31387.log.gz 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69388.log 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47680.log.gz 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44050.log 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69708.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69708.log.gz 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69388.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64959.log 2026-03-31T20:41:56.962 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69388.log.gz 2026-03-31T20:41:56.963 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38255.log 2026-03-31T20:41:56.963 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44050.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44050.log.gz 2026-03-31T20:41:56.963 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60854.log 2026-03-31T20:41:56.963 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64959.log.gz 2026-03-31T20:41:56.963 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38255.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.38255.log.gz --verbose 2026-03-31T20:41:56.963 INFO:teuthology.orchestra.run.vm03.stderr: -- /var/log/ceph/ceph-client.admin.45922.log 2026-03-31T20:41:56.963 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46959.log 2026-03-31T20:41:56.964 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60854.log: /var/log/ceph/ceph-client.admin.45922.log: /var/log/ceph/ceph-client.admin.46959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45922.log.gz 2026-03-31T20:41:56.964 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46959.log.gz 2026-03-31T20:41:56.964 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49992.log 2026-03-31T20:41:56.964 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60854.log.gz 2026-03-31T20:41:56.964 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34435.log 2026-03-31T20:41:56.964 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49992.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26309.log 2026-03-31T20:41:56.964 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49992.log.gz 2026-03-31T20:41:56.965 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34435.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67514.log 2026-03-31T20:41:56.965 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34435.log.gz 2026-03-31T20:41:56.965 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58530.log 2026-03-31T20:41:56.965 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26309.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26309.log.gz 2026-03-31T20:41:56.965 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67514.log: 0.0%gzip -- replaced with /var/log/ceph/ceph-client.admin.67514.log.gz -5 2026-03-31T20:41:56.965 INFO:teuthology.orchestra.run.vm03.stderr: --verbose -- /var/log/ceph/ceph-client.admin.67456.log 2026-03-31T20:41:56.965 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58530.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69996.log 2026-03-31T20:41:56.965 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58530.log.gz 2026-03-31T20:41:56.966 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28547.log 2026-03-31T20:41:56.966 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67456.log.gz 2026-03-31T20:41:56.966 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69996.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39925.log 2026-03-31T20:41:56.966 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69996.log.gz 2026-03-31T20:41:56.966 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28547.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rw.33362.log 2026-03-31T20:41:56.966 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28547.log.gz 2026-03-31T20:41:56.966 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31916.log 2026-03-31T20:41:56.966 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39925.log: /var/log/ceph/ceph-client.xx-profile-rw.33362.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39925.log.gz 2026-03-31T20:41:56.966 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rw.33362.log.gz 2026-03-31T20:41:56.967 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29689.log 2026-03-31T20:41:56.967 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31916.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58448.log 2026-03-31T20:41:56.967 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31916.log.gz 2026-03-31T20:41:56.967 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29689.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38566.log 2026-03-31T20:41:56.967 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29689.log.gz 2026-03-31T20:41:56.967 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50659.log 2026-03-31T20:41:56.967 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58448.log.gz 2026-03-31T20:41:56.967 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49773.log 2026-03-31T20:41:56.968 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38566.log.gz 2026-03-31T20:41:56.968 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50659.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50659.log.gz 2026-03-31T20:41:56.968 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38959.log 2026-03-31T20:41:56.968 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61172.log 2026-03-31T20:41:56.968 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49773.log.gz 2026-03-31T20:41:56.968 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64798.log 2026-03-31T20:41:56.968 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38959.log.gz 2026-03-31T20:41:56.969 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61172.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61172.log.gz 2026-03-31T20:41:56.969 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32556.log 2026-03-31T20:41:56.969 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70763.log 2026-03-31T20:41:56.969 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64798.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64798.log.gz 2026-03-31T20:41:56.969 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68391.log 2026-03-31T20:41:56.970 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32556.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32556.log.gz 2026-03-31T20:41:56.970 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70763.log.gz 2026-03-31T20:41:56.970 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40457.log 2026-03-31T20:41:56.970 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35610.log 2026-03-31T20:41:56.970 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68391.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68391.log.gz 2026-03-31T20:41:56.970 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56180.log 2026-03-31T20:41:56.970 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40457.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40457.log.gz 2026-03-31T20:41:56.970 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35610.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35610.log.gz 2026-03-31T20:41:56.970 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30587.log 2026-03-31T20:41:56.971 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56013.log 2026-03-31T20:41:56.971 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56180.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56180.log.gz 2026-03-31T20:41:56.971 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38983.log 2026-03-31T20:41:56.971 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30587.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30587.log.gz 2026-03-31T20:41:56.971 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56013.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56013.log.gz 2026-03-31T20:41:56.971 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36470.log 2026-03-31T20:41:56.971 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43743.log 2026-03-31T20:41:56.972 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38983.log.gz 2026-03-31T20:41:56.972 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36470.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62180.log 2026-03-31T20:41:56.972 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36470.log.gz 2026-03-31T20:41:56.972 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43743.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38373.log 2026-03-31T20:41:56.972 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43743.log.gz 2026-03-31T20:41:56.972 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30232.log 2026-03-31T20:41:56.972 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62180.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62180.log.gz 2026-03-31T20:41:56.972 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48935.log 2026-03-31T20:41:56.973 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38373.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38373.log.gz 2026-03-31T20:41:56.973 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30232.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31638.log 2026-03-31T20:41:56.973 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30232.log.gz 2026-03-31T20:41:56.973 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53522.log 2026-03-31T20:41:56.973 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48935.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48935.log.gz 2026-03-31T20:41:56.973 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39164.log 2026-03-31T20:41:56.973 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31638.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31638.log.gz 2026-03-31T20:41:56.973 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53522.log.gz 2026-03-31T20:41:56.973 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54935.log 2026-03-31T20:41:56.974 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36659.log 2026-03-31T20:41:56.974 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39164.log.gz 2026-03-31T20:41:56.974 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30928.log 2026-03-31T20:41:56.974 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54935.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54935.log.gz 2026-03-31T20:41:56.974 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36659.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36659.log.gz 2026-03-31T20:41:56.974 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70188.log 2026-03-31T20:41:56.974 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41404.log 2026-03-31T20:41:56.974 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30928.log.gz 2026-03-31T20:41:56.975 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27219.log 2026-03-31T20:41:56.975 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70188.log.gz 2026-03-31T20:41:56.975 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41404.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41404.log.gz 2026-03-31T20:41:56.975 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68321.log 2026-03-31T20:41:56.975 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51208.log 2026-03-31T20:41:56.975 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27219.log.gz 2026-03-31T20:41:56.975 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46863.log 2026-03-31T20:41:56.975 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68321.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68321.log.gz 2026-03-31T20:41:56.975 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51208.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51208.log.gz 2026-03-31T20:41:56.976 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66496.log 2026-03-31T20:41:56.976 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62708.log 2026-03-31T20:41:56.976 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46863.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46863.log.gz 2026-03-31T20:41:56.976 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40355.log 2026-03-31T20:41:56.976 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66496.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66496.log.gz 2026-03-31T20:41:56.976 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62708.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62708.log.gz 2026-03-31T20:41:56.976 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38463.log 2026-03-31T20:41:56.976 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61750.log 2026-03-31T20:41:56.977 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40355.log.gz 2026-03-31T20:41:56.977 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39873.log 2026-03-31T20:41:56.977 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38463.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38463.log.gz 2026-03-31T20:41:56.977 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61750.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61750.log.gz 2026-03-31T20:41:56.977 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68091.log 2026-03-31T20:41:56.977 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63305.log 2026-03-31T20:41:56.977 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39873.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39873.log.gz 2026-03-31T20:41:56.977 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42071.log 2026-03-31T20:41:56.978 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68091.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68091.log.gz 2026-03-31T20:41:56.978 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63305.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63305.log.gz 2026-03-31T20:41:56.978 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47412.log 2026-03-31T20:41:56.978 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57067.log 2026-03-31T20:41:56.978 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42071.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42071.log.gz 2026-03-31T20:41:56.978 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60619.log 2026-03-31T20:41:56.978 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47412.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47412.log.gz 2026-03-31T20:41:56.978 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57067.log.gz 2026-03-31T20:41:56.978 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40026.log 2026-03-31T20:41:56.979 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33555.log 2026-03-31T20:41:56.979 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60619.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60619.log.gz 2026-03-31T20:41:56.979 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40026.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52123.log 2026-03-31T20:41:56.979 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40026.log.gz 2026-03-31T20:41:56.979 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33555.log.gz 
2026-03-31T20:41:56.979 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68581.log 2026-03-31T20:41:56.979 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48360.log 2026-03-31T20:41:56.980 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52123.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51496.log 2026-03-31T20:41:56.980 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68581.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52123.log.gz 2026-03-31T20:41:56.980 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68581.log.gz 2026-03-31T20:41:56.980 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48360.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48360.log.gz 2026-03-31T20:41:56.980 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30536.log 2026-03-31T20:41:56.980 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48959.log 2026-03-31T20:41:56.980 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51496.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51496.log.gz 2026-03-31T20:41:56.980 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30536.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35786.log 2026-03-31T20:41:56.981 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30536.log.gz 2026-03-31T20:41:56.981 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48959.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53492.log 2026-03-31T20:41:56.981 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48959.log.gz 2026-03-31T20:41:56.981 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29882.log 2026-03-31T20:41:56.981 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35786.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35786.log.gz 2026-03-31T20:41:56.981 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27944.log 2026-03-31T20:41:56.981 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53492.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53492.log.gz 2026-03-31T20:41:56.981 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29882.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32839.log 2026-03-31T20:41:56.981 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29882.log.gz 2026-03-31T20:41:56.982 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27944.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27944.log.gz 2026-03-31T20:41:56.982 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69316.log 2026-03-31T20:41:56.982 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51260.log 2026-03-31T20:41:56.982 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32839.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32839.log.gz 2026-03-31T20:41:56.982 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69316.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66399.log 2026-03-31T20:41:56.982 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69316.log.gz 2026-03-31T20:41:56.983 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51260.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64298.log 2026-03-31T20:41:56.983 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51260.log.gz 2026-03-31T20:41:56.983 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66399.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66399.log.gz 2026-03-31T20:41:56.983 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43292.log 2026-03-31T20:41:56.983 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27295.log 2026-03-31T20:41:56.983 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64298.log.gz 2026-03-31T20:41:56.984 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33694.log 2026-03-31T20:41:56.984 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43292.log: /var/log/ceph/ceph-client.admin.27295.log: 0.0% 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27295.log.gz 2026-03-31T20:41:56.984 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54027.log 2026-03-31T20:41:56.984 INFO:teuthology.orchestra.run.vm03.stderr: -- replaced with /var/log/ceph/ceph-client.admin.43292.log.gz 2026-03-31T20:41:56.984 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33694.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33694.log.gz 2026-03-31T20:41:56.984 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78408.log 2026-03-31T20:41:56.984 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67485.log 2026-03-31T20:41:56.985 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54027.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54027.log.gz 2026-03-31T20:41:56.985 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58555.log 2026-03-31T20:41:56.985 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78408.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78408.log.gz 2026-03-31T20:41:56.985 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67485.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43694.log 2026-03-31T20:41:56.985 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67485.log.gz 2026-03-31T20:41:56.985 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35385.log 2026-03-31T20:41:56.985 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58555.log.gz 2026-03-31T20:41:56.985 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35684.log 2026-03-31T20:41:56.986 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43694.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43694.log.gz
[... 2026-03-31T20:41:56.986 through 2026-03-31T20:41:57.004: a few hundred similar records from concurrent gzip processes, their stderr fragments interleaved, covering further /var/log/ceph/ceph-client.admin.*.log and /var/log/ceph/ceph-client.xx-profile-{rd,rw}.*.log files; each "gzip -5 --verbose -- <file>" invocation is paired with "<file>: 0.0% -- replaced with <file>.gz" ...]
2026-03-31T20:41:57.004 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56735.log
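[The scrambled fragments above are what a single shared stderr stream looks like when many gzip processes run at once, and the 0.0% ratios are consistent with empty or nearly empty per-client log files. A minimal sketch of a command shape that produces this kind of output, assuming GNU find/xargs; the exact command teuthology ran is not shown in this excerpt:

    # Compress every Ceph log in parallel; each gzip writes its
    # --verbose report to the shared stderr, so lines interleave.
    find /var/log/ceph -name '*.log' -print0 \
        | xargs -0 -n 1 -P 8 gzip -5 --verbose --

With -P 8, up to eight gzip processes run concurrently, which matches the out-of-order invocation/"replaced with" pairs captured in this log.]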
2026-03-31T20:41:57.004 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48983.log.gz 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43467.log.gz 2026-03-31T20:41:57.004 INFO:teuthology.orchestra.run.vm03.stderr: 2026-03-31T20:41:57.004 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64521.log.gz 2026-03-31T20:41:57.004 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44004.log 2026-03-31T20:41:57.004 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34092.log 2026-03-31T20:41:57.004 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56735.log.gz 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69093.log 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44004.log.gz 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34092.log.gz 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32018.log 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60570.log 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69093.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69093.log.gz 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29300.log 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32018.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32018.log.gz 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60570.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60570.log.gz 2026-03-31T20:41:57.005 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40305.log 2026-03-31T20:41:57.006 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49581.log 2026-03-31T20:41:57.006 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29300.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29300.log.gz 2026-03-31T20:41:57.006 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56781.log 2026-03-31T20:41:57.006 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40305.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40305.log.gz 2026-03-31T20:41:57.006 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49581.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49581.log.gz 2026-03-31T20:41:57.006 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-mgr.x.log 2026-03-31T20:41:57.006 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69824.log 2026-03-31T20:41:57.006 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56781.log: 0.0% -- 
replaced with /var/log/ceph/ceph-client.admin.56781.log.gz 2026-03-31T20:41:57.007 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71112.log 2026-03-31T20:41:57.007 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-mgr.x.log: /var/log/ceph/ceph-client.admin.69824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69824.log.gz 2026-03-31T20:41:57.007 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49343.log 2026-03-31T20:41:57.007 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71112.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71112.log.gz 2026-03-31T20:41:57.007 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33742.log 2026-03-31T20:41:57.008 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49343.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49343.log.gz 2026-03-31T20:41:57.008 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33436.log 2026-03-31T20:41:57.008 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58066.log 2026-03-31T20:41:57.008 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33742.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33742.log.gz 2026-03-31T20:41:57.008 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33436.log.gz 2026-03-31T20:41:57.008 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37282.log 2026-03-31T20:41:57.009 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54447.log 2026-03-31T20:41:57.009 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58066.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58066.log.gz 2026-03-31T20:41:57.009 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37282.log.gz 2026-03-31T20:41:57.009 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55437.log 2026-03-31T20:41:57.009 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44321.log 2026-03-31T20:41:57.009 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54447.log.gz 2026-03-31T20:41:57.009 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55437.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55437.log.gz 2026-03-31T20:41:57.010 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57418.log 2026-03-31T20:41:57.010 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50164.log 2026-03-31T20:41:57.010 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44321.log: /var/log/ceph/ceph-client.admin.57418.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44321.log.gz 2026-03-31T20:41:57.010 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57418.log.gz 2026-03-31T20:41:57.010 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47824.log 2026-03-31T20:41:57.010 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64883.log 2026-03-31T20:41:57.011 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50164.log.gz 2026-03-31T20:41:57.011 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47824.log.gz 2026-03-31T20:41:57.011 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52809.log 2026-03-31T20:41:57.011 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53414.log 2026-03-31T20:41:57.011 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64883.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64883.log.gz 2026-03-31T20:41:57.011 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52809.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52809.log.gz 2026-03-31T20:41:57.011 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25888.log 2026-03-31T20:41:57.012 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41430.log 2026-03-31T20:41:57.012 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53414.log.gz 2026-03-31T20:41:57.012 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25888.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25888.log.gz 2026-03-31T20:41:57.012 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28735.log 2026-03-31T20:41:57.012 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69755.log 2026-03-31T20:41:57.012 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41430.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41430.log.gz 2026-03-31T20:41:57.012 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28735.log.gz 2026-03-31T20:41:57.013 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37258.log 2026-03-31T20:41:57.013 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46162.log 2026-03-31T20:41:57.013 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69755.log: /var/log/ceph/ceph-client.admin.37258.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69755.log.gz 2026-03-31T20:41:57.013 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37258.log.gz 2026-03-31T20:41:57.013 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35907.log 2026-03-31T20:41:57.013 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62334.log 2026-03-31T20:41:57.014 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46162.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46162.log.gz 2026-03-31T20:41:57.014 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35907.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35907.log.gz 2026-03-31T20:41:57.014 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35097.log 2026-03-31T20:41:57.014 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61072.log 2026-03-31T20:41:57.014 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62334.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62334.log.gz 2026-03-31T20:41:57.014 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35097.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35097.log.gz 2026-03-31T20:41:57.014 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31004.log 2026-03-31T20:41:57.014 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59317.log 2026-03-31T20:41:57.015 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61072.log.gz 2026-03-31T20:41:57.015 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31004.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31004.log.gz 2026-03-31T20:41:57.015 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68836.log 2026-03-31T20:41:57.015 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49919.log 2026-03-31T20:41:57.015 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59317.log.gz 2026-03-31T20:41:57.015 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68836.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68836.log.gz 2026-03-31T20:41:57.015 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65053.log 2026-03-31T20:41:57.016 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57881.log 2026-03-31T20:41:57.016 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49919.log: /var/log/ceph/ceph-client.admin.65053.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49919.log.gz 2026-03-31T20:41:57.016 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65053.log.gz 2026-03-31T20:41:57.016 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71611.log 2026-03-31T20:41:57.016 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28983.log 2026-03-31T20:41:57.016 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57881.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57881.log.gz 2026-03-31T20:41:57.017 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71611.log.gz 2026-03-31T20:41:57.017 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65307.log 2026-03-31T20:41:57.017 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50947.log 2026-03-31T20:41:57.017 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28983.log.gz 2026-03-31T20:41:57.017 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65307.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65307.log.gz 2026-03-31T20:41:57.017 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29470.log 2026-03-31T20:41:57.017 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34582.log 2026-03-31T20:41:57.018 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50947.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50947.log.gz 2026-03-31T20:41:57.018 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29470.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29470.log.gz 2026-03-31T20:41:57.018 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57737.log 2026-03-31T20:41:57.018 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30952.log 2026-03-31T20:41:57.018 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34582.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34582.log.gz 2026-03-31T20:41:57.018 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57737.log.gz 2026-03-31T20:41:57.018 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35032.log 2026-03-31T20:41:57.019 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78264.log 2026-03-31T20:41:57.019 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30952.log: /var/log/ceph/ceph-client.admin.35032.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30952.log.gz 2026-03-31T20:41:57.019 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35032.log.gz 2026-03-31T20:41:57.019 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60112.log 2026-03-31T20:41:57.019 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27974.log 2026-03-31T20:41:57.019 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78264.log.gz 2026-03-31T20:41:57.020 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60112.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60112.log.gz 2026-03-31T20:41:57.020 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59462.log 2026-03-31T20:41:57.020 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68139.log 2026-03-31T20:41:57.020 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27974.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27974.log.gz 2026-03-31T20:41:57.021 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59462.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59462.log.gz 2026-03-31T20:41:57.021 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53150.log 2026-03-31T20:41:57.021 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62686.log 2026-03-31T20:41:57.021 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68139.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68139.log.gz 2026-03-31T20:41:57.021 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53150.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53150.log.gz 2026-03-31T20:41:57.021 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38438.log 2026-03-31T20:41:57.021 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50789.log 2026-03-31T20:41:57.022 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62686.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62686.log.gz 2026-03-31T20:41:57.022 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38438.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38438.log.gz 2026-03-31T20:41:57.022 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58640.log 2026-03-31T20:41:57.022 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin2.32753.log 2026-03-31T20:41:57.022 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50789.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50789.log.gz 2026-03-31T20:41:57.022 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58640.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58640.log.gz 2026-03-31T20:41:57.022 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54519.log 2026-03-31T20:41:57.023 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44345.log 2026-03-31T20:41:57.023 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin2.32753.log: /var/log/ceph/ceph-client.admin.54519.log: 66.9% -- replaced with /var/log/ceph/ceph-client.admin2.32753.log.gz 2026-03-31T20:41:57.023 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54519.log.gz 2026-03-31T20:41:57.023 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71669.log 2026-03-31T20:41:57.023 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62890.log 2026-03-31T20:41:57.023 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44345.log.gz 2026-03-31T20:41:57.023 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71669.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71669.log.gz 2026-03-31T20:41:57.024 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34560.log 2026-03-31T20:41:57.024 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50450.log 2026-03-31T20:41:57.024 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62890.log: /var/log/ceph/ceph-client.admin.34560.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62890.log.gz 2026-03-31T20:41:57.024 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34560.log.gz 2026-03-31T20:41:57.024 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59128.log 2026-03-31T20:41:57.024 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62136.log 2026-03-31T20:41:57.025 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50450.log.gz 2026-03-31T20:41:57.025 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59128.log.gz 2026-03-31T20:41:57.025 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32957.log 2026-03-31T20:41:57.025 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29985.log 2026-03-31T20:41:57.025 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62136.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62136.log.gz 2026-03-31T20:41:57.025 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32957.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32957.log.gz 2026-03-31T20:41:57.025 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66448.log 2026-03-31T20:41:57.026 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71403.log 2026-03-31T20:41:57.026 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29985.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29985.log.gz 2026-03-31T20:41:57.026 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66448.log.gz 2026-03-31T20:41:57.026 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78342.log 2026-03-31T20:41:57.026 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27143.log 2026-03-31T20:41:57.026 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71403.log.gz 2026-03-31T20:41:57.026 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78342.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78342.log.gz 2026-03-31T20:41:57.026 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36418.log 2026-03-31T20:41:57.027 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27535.log 2026-03-31T20:41:57.027 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27143.log: /var/log/ceph/ceph-client.admin.36418.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27143.log.gz 2026-03-31T20:41:57.027 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36418.log.gz 2026-03-31T20:41:57.027 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68743.log 2026-03-31T20:41:57.027 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33648.log 2026-03-31T20:41:57.027 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27535.log.gz 2026-03-31T20:41:57.028 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68743.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68743.log.gz 2026-03-31T20:41:57.028 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28345.log 2026-03-31T20:41:57.028 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48247.log 2026-03-31T20:41:57.028 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33648.log: /var/log/ceph/ceph-client.admin.28345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33648.log.gz 2026-03-31T20:41:57.028 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28345.log.gz 2026-03-31T20:41:57.028 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28320.log 2026-03-31T20:41:57.028 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55699.log 2026-03-31T20:41:57.029 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48247.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48247.log.gz 2026-03-31T20:41:57.029 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28320.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28320.log.gz 2026-03-31T20:41:57.029 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66955.log 2026-03-31T20:41:57.029 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52488.log 2026-03-31T20:41:57.029 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55699.log: /var/log/ceph/ceph-client.admin.66955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55699.log.gz 2026-03-31T20:41:57.029 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66955.log.gz 2026-03-31T20:41:57.030 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70352.log 2026-03-31T20:41:57.030 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46338.log 2026-03-31T20:41:57.030 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52488.log: /var/log/ceph/ceph-client.admin.70352.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52488.log.gz 2026-03-31T20:41:57.030 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70352.log.gz 2026-03-31T20:41:57.030 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61622.log 2026-03-31T20:41:57.030 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55341.log 2026-03-31T20:41:57.030 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46338.log.gz 2026-03-31T20:41:57.031 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61622.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61622.log.gz 2026-03-31T20:41:57.031 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41687.log 2026-03-31T20:41:57.031 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54544.log 2026-03-31T20:41:57.031 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55341.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55341.log.gz 2026-03-31T20:41:57.031 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41687.log.gz 2026-03-31T20:41:57.031 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29055.log 2026-03-31T20:41:57.032 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68463.log 2026-03-31T20:41:57.032 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29055.log: /var/log/ceph/ceph-client.admin.54544.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29055.log.gz 2026-03-31T20:41:57.032 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54544.log.gz 2026-03-31T20:41:57.032 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59729.log 2026-03-31T20:41:57.032 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48606.log 2026-03-31T20:41:57.032 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68463.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68463.log.gz 2026-03-31T20:41:57.032 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59729.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59729.log.gz 2026-03-31T20:41:57.033 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69272.log 2026-03-31T20:41:57.033 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63541.log 2026-03-31T20:41:57.033 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48606.log.gz 2026-03-31T20:41:57.033 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69272.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69272.log.gz 2026-03-31T20:41:57.033 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28170.log 2026-03-31T20:41:57.033 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62915.log 2026-03-31T20:41:57.033 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63541.log.gz 2026-03-31T20:41:57.034 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28170.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28170.log.gz 2026-03-31T20:41:57.034 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33625.log 2026-03-31T20:41:57.034 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68997.log 2026-03-31T20:41:57.034 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62915.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62915.log.gz 2026-03-31T20:41:57.034 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33625.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33625.log.gz 2026-03-31T20:41:57.034 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58766.log 
2026-03-31T20:41:57.034 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30134.log 2026-03-31T20:41:57.035 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68997.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68997.log.gz 2026-03-31T20:41:57.035 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58766.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58766.log.gz 2026-03-31T20:41:57.035 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28805.log 2026-03-31T20:41:57.035 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42836.log 2026-03-31T20:41:57.035 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30134.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30134.log.gz 2026-03-31T20:41:57.035 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28805.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28805.log.gz 2026-03-31T20:41:57.035 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49822.log 2026-03-31T20:41:57.036 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29079.log 2026-03-31T20:41:57.036 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42836.log: /var/log/ceph/ceph-client.admin.49822.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42836.log.gz 2026-03-31T20:41:57.036 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49822.log.gz 2026-03-31T20:41:57.036 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45532.log 2026-03-31T20:41:57.036 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27579.log 2026-03-31T20:41:57.036 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29079.log.gz 2026-03-31T20:41:57.037 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45532.log.gz 2026-03-31T20:41:57.037 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70280.log 2026-03-31T20:41:57.037 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47221.log 2026-03-31T20:41:57.037 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27579.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27579.log.gz 2026-03-31T20:41:57.037 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70280.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70280.log.gz 2026-03-31T20:41:57.037 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42476.log 2026-03-31T20:41:57.037 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40433.log 2026-03-31T20:41:57.038 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47221.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47221.log.gz 2026-03-31T20:41:57.038 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42476.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42476.log.gz 
2026-03-31T20:41:57.038 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42552.log 2026-03-31T20:41:57.038 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26849.log 2026-03-31T20:41:57.038 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40433.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40433.log.gz 2026-03-31T20:41:57.038 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42552.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42552.log.gz 2026-03-31T20:41:57.039 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53602.log 2026-03-31T20:41:57.039 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68673.log 2026-03-31T20:41:57.039 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26849.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26849.log.gz 2026-03-31T20:41:57.039 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53602.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53602.log.gz 2026-03-31T20:41:57.039 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61472.log 2026-03-31T20:41:57.039 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59608.log 2026-03-31T20:41:57.039 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68673.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68673.log.gz 2026-03-31T20:41:57.039 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61472.log.gz 2026-03-31T20:41:57.040 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39900.log 2026-03-31T20:41:57.040 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28238.log 2026-03-31T20:41:57.040 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59608.log: /var/log/ceph/ceph-client.admin.39900.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59608.log.gz 2026-03-31T20:41:57.040 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39900.log.gz 2026-03-31T20:41:57.040 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36587.log 2026-03-31T20:41:57.040 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46019.log 2026-03-31T20:41:57.041 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28238.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28238.log.gz 2026-03-31T20:41:57.041 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36587.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36587.log.gz 2026-03-31T20:41:57.041 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59922.log 2026-03-31T20:41:57.041 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51783.log 2026-03-31T20:41:57.041 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46019.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46019.log.gz 2026-03-31T20:41:57.041 
2026-03-31T20:41:57.041 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59922.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59922.log.gz
2026-03-31T20:41:57.041 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31289.log
2026-03-31T20:41:57.042 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62290.log
2026-03-31T20:41:57.042 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51783.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51783.log.gz
2026-03-31T20:41:57.042 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31289.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31289.log.gz
2026-03-31T20:41:57.042 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45034.log
2026-03-31T20:41:57.042 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58121.log
2026-03-31T20:41:57.042 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62290.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62290.log.gz
2026-03-31T20:41:57.042 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45034.log.gz
2026-03-31T20:41:57.043 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25958.log
2026-03-31T20:41:57.043 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43496.log
2026-03-31T20:41:57.043 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58121.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58121.log.gz
2026-03-31T20:41:57.043 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25958.log.gz
2026-03-31T20:41:57.043 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54642.log
2026-03-31T20:41:57.043 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51050.log
2026-03-31T20:41:57.044 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43496.log: /var/log/ceph/ceph-client.admin.54642.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54642.log.gz
2026-03-31T20:41:57.044 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43496.log.gz
2026-03-31T20:41:57.044 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61047.log
2026-03-31T20:41:57.044 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30308.log
2026-03-31T20:41:57.044 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51050.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51050.log.gz
2026-03-31T20:41:57.044 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61047.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61047.log.gz
2026-03-31T20:41:57.045 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56253.log
2026-03-31T20:41:57.045 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39431.log
2026-03-31T20:41:57.045 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30308.log.gz
2026-03-31T20:41:57.045 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56253.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56253.log.gz
2026-03-31T20:41:57.045 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56470.log
2026-03-31T20:41:57.045 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60156.log
2026-03-31T20:41:57.045 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39431.log.gz
2026-03-31T20:41:57.046 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56470.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56470.log.gz
2026-03-31T20:41:57.046 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63968.log
2026-03-31T20:41:57.046 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77825.log
2026-03-31T20:41:57.046 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60156.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60156.log.gz
2026-03-31T20:41:57.046 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63968.log.gz
2026-03-31T20:41:57.046 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36896.log
2026-03-31T20:41:57.046 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68949.log
2026-03-31T20:41:57.047 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77825.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77825.log.gz
2026-03-31T20:41:57.047 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36896.log.gz
2026-03-31T20:41:57.047 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31789.log
2026-03-31T20:41:57.047 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29959.log
2026-03-31T20:41:57.047 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68949.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68949.log.gz
2026-03-31T20:41:57.047 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31789.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31789.log.gz
2026-03-31T20:41:57.047 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35460.log
2026-03-31T20:41:57.048 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29959.log.gz
2026-03-31T20:41:57.048 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49319.log
2026-03-31T20:41:57.048 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35460.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59152.log
2026-03-31T20:41:57.048 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35460.log.gz
2026-03-31T20:41:57.048 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49319.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49319.log.gz
2026-03-31T20:41:57.049 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69486.log
2026-03-31T20:41:57.049 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63445.log
2026-03-31T20:41:57.049 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59152.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59152.log.gz
2026-03-31T20:41:57.049 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69486.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69486.log.gz
2026-03-31T20:41:57.049 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60134.log
2026-03-31T20:41:57.049 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30803.log
2026-03-31T20:41:57.050 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63445.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63445.log.gz
2026-03-31T20:41:57.050 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60134.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60134.log.gz
2026-03-31T20:41:57.050 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29665.log
2026-03-31T20:41:57.050 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30419.log
2026-03-31T20:41:57.050 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30803.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30803.log.gz
2026-03-31T20:41:57.050 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29665.log.gz
2026-03-31T20:41:57.050 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30755.log
2026-03-31T20:41:57.051 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37780.log
2026-03-31T20:41:57.051 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30419.log.gz
2026-03-31T20:41:57.051 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30755.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30755.log.gz
2026-03-31T20:41:57.051 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45751.log
2026-03-31T20:41:57.051 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48887.log
2026-03-31T20:41:57.051 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37780.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37780.log.gz
2026-03-31T20:41:57.051 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45751.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45751.log.gz
2026-03-31T20:41:57.051 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56757.log
2026-03-31T20:41:57.052 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66771.log
2026-03-31T20:41:57.052 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48887.log.gz
2026-03-31T20:41:57.052 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56757.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56757.log.gz
2026-03-31T20:41:57.052 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66278.log
2026-03-31T20:41:57.052 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71553.log
2026-03-31T20:41:57.052 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66771.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66771.log.gz
2026-03-31T20:41:57.053 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66278.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66278.log.gz
2026-03-31T20:41:57.053 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31313.log
2026-03-31T20:41:57.053 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47944.log
2026-03-31T20:41:57.053 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71553.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71553.log.gz
2026-03-31T20:41:57.053 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31313.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31313.log.gz
2026-03-31T20:41:57.053 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70450.log
2026-03-31T20:41:57.053 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60398.log
2026-03-31T20:41:57.054 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47944.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47944.log.gz
2026-03-31T20:41:57.054 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70450.log.gz
2026-03-31T20:41:57.054 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34116.log
2026-03-31T20:41:57.054 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27723.log
2026-03-31T20:41:57.054 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60398.log.gz
2026-03-31T20:41:57.054 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34116.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34116.log.gz
2026-03-31T20:41:57.054 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29129.log
2026-03-31T20:41:57.055 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59511.log
2026-03-31T20:41:57.055 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27723.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27723.log.gz
2026-03-31T20:41:57.055 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29129.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29129.log.gz
2026-03-31T20:41:57.055 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30158.log
2026-03-31T20:41:57.055 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55606.log
2026-03-31T20:41:57.055 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59511.log.gz
2026-03-31T20:41:57.055 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30158.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30158.log.gz
2026-03-31T20:41:57.056 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31363.log
2026-03-31T20:41:57.056 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34188.log
2026-03-31T20:41:57.056 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55606.log.gz
2026-03-31T20:41:57.056 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31363.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31363.log.gz
2026-03-31T20:41:57.056 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph.tmp-client.admin.19120.log
2026-03-31T20:41:57.056 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42681.log
2026-03-31T20:41:57.057 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34188.log.gz
2026-03-31T20:41:57.057 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph.tmp-client.admin.19120.log: 0.0% -- replaced with /var/log/ceph/ceph.tmp-client.admin.19120.log.gz
2026-03-31T20:41:57.057 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31071.log
2026-03-31T20:41:57.057 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38231.log
2026-03-31T20:41:57.057 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42681.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42681.log.gz
2026-03-31T20:41:57.057 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31071.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31071.log.gz
2026-03-31T20:41:57.057 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44566.log
2026-03-31T20:41:57.057 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48911.log
2026-03-31T20:41:57.058 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38231.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38231.log.gz
2026-03-31T20:41:57.058 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44566.log.gz
2026-03-31T20:41:57.058 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68415.log
2026-03-31T20:41:57.058 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55415.log
2026-03-31T20:41:57.058 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48911.log.gz
2026-03-31T20:41:57.058 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68415.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68415.log.gz
2026-03-31T20:41:57.058 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30356.log
2026-03-31T20:41:57.059 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43908.log
2026-03-31T20:41:57.059 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55415.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55415.log.gz
2026-03-31T20:41:57.059 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30356.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30356.log.gz
2026-03-31T20:41:57.059 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48124.log
2026-03-31T20:41:57.059 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68928.log
2026-03-31T20:41:57.059 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43908.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43908.log.gz
2026-03-31T20:41:57.060 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48124.log.gz
2026-03-31T20:41:57.060 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32183.log
2026-03-31T20:41:57.060 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54593.log
2026-03-31T20:41:57.060 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68928.log.gz
2026-03-31T20:41:57.060 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32183.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32183.log.gz
2026-03-31T20:41:57.060 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63725.log
2026-03-31T20:41:57.061 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58580.log
2026-03-31T20:41:57.061 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54593.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54593.log.gz
2026-03-31T20:41:57.061 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63725.log.gz
2026-03-31T20:41:57.061 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26728.log
2026-03-31T20:41:57.061 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42095.log
2026-03-31T20:41:57.061 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58580.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58580.log.gz
2026-03-31T20:41:57.061 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26728.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26728.log.gz
2026-03-31T20:41:57.062 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60044.log
2026-03-31T20:41:57.062 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55292.log
2026-03-31T20:41:57.062 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42095.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42095.log.gz
2026-03-31T20:41:57.062 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60044.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60044.log.gz
2026-03-31T20:41:57.062 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67621.log
2026-03-31T20:41:57.062 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78386.log
2026-03-31T20:41:57.063 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55292.log.gz
2026-03-31T20:41:57.063 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67621.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67621.log.gz
2026-03-31T20:41:57.063 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42990.log
2026-03-31T20:41:57.063 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78471.log
2026-03-31T20:41:57.063 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78386.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78386.log.gz
2026-03-31T20:41:57.063 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42990.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42990.log.gz
2026-03-31T20:41:57.063 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43267.log
2026-03-31T20:41:57.064 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43267.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40586.log
2026-03-31T20:41:57.064 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43267.log.gz
2026-03-31T20:41:57.064 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78471.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78471.log.gz
2026-03-31T20:41:57.064 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53830.log
2026-03-31T20:41:57.064 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27627.log
2026-03-31T20:41:57.065 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40586.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40586.log.gz
2026-03-31T20:41:57.065 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53830.log.gz
2026-03-31T20:41:57.065 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40077.log
2026-03-31T20:41:57.065 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67646.log
2026-03-31T20:41:57.065 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27627.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27627.log.gz
2026-03-31T20:41:57.065 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40077.log.gz
2026-03-31T20:41:57.065 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57363.log
2026-03-31T20:41:57.066 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70835.log
2026-03-31T20:41:57.066 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67646.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67646.log.gz
2026-03-31T20:41:57.066 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57363.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57363.log.gz
2026-03-31T20:41:57.066 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35560.log
2026-03-31T20:41:57.066 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68041.log
2026-03-31T20:41:57.066 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70835.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70835.log.gz
2026-03-31T20:41:57.067 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35560.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35560.log.gz
2026-03-31T20:41:57.067 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66929.log
2026-03-31T20:41:57.067 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47510.log
2026-03-31T20:41:57.067 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68041.log: /var/log/ceph/ceph-client.admin.66929.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68041.log.gz
2026-03-31T20:41:57.067 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66929.log.gz
2026-03-31T20:41:57.067 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62092.log
2026-03-31T20:41:57.067 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46043.log
2026-03-31T20:41:57.067 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47510.log.gz
2026-03-31T20:41:57.068 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62092.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62092.log.gz
2026-03-31T20:41:57.068 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34308.log
2026-03-31T20:41:57.068 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70091.log
2026-03-31T20:41:57.068 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46043.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46043.log.gz
2026-03-31T20:41:57.068 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34308.log.gz
2026-03-31T20:41:57.068 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58717.log
2026-03-31T20:41:57.068 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22808.log
2026-03-31T20:41:57.069 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70091.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70091.log.gz
2026-03-31T20:41:57.069 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58717.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58717.log.gz
2026-03-31T20:41:57.069 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57580.log
2026-03-31T20:41:57.069 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67794.log
2026-03-31T20:41:57.069 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.22808.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22808.log.gz
2026-03-31T20:41:57.069 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57580.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57580.log.gz
2026-03-31T20:41:57.070 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41893.log
2026-03-31T20:41:57.070 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26140.log
2026-03-31T20:41:57.070 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67794.log: /var/log/ceph/ceph-client.admin.41893.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67794.log.gz
2026-03-31T20:41:57.070 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41893.log.gz
2026-03-31T20:41:57.070 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62488.log
2026-03-31T20:41:57.070 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54104.log
2026-03-31T20:41:57.070 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26140.log.gz
2026-03-31T20:41:57.071 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62488.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62488.log.gz
2026-03-31T20:41:57.071 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77937.log
2026-03-31T20:41:57.071 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40127.log
2026-03-31T20:41:57.071 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54104.log: /var/log/ceph/ceph-client.admin.77937.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54104.log.gz
2026-03-31T20:41:57.071 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77937.log.gz
2026-03-31T20:41:57.071 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59007.log
2026-03-31T20:41:57.071 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46775.log
2026-03-31T20:41:57.072 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40127.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40127.log.gz
2026-03-31T20:41:57.072 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59007.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59007.log.gz
2026-03-31T20:41:57.072 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45106.log
2026-03-31T20:41:57.072 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47705.log
2026-03-31T20:41:57.072 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46775.log: /var/log/ceph/ceph-client.admin.45106.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46775.log.gz
2026-03-31T20:41:57.072 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45106.log.gz
2026-03-31T20:41:57.072 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43040.log
2026-03-31T20:41:57.073 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54262.log
2026-03-31T20:41:57.073 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47705.log: /var/log/ceph/ceph-client.admin.43040.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47705.log.gz
2026-03-31T20:41:57.073 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43040.log.gz
2026-03-31T20:41:57.073 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68695.log
2026-03-31T20:41:57.073 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37378.log
2026-03-31T20:41:57.073 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54262.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54262.log.gz
2026-03-31T20:41:57.073 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68695.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68695.log.gz
2026-03-31T20:41:57.074 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46435.log
2026-03-31T20:41:57.074 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62842.log
2026-03-31T20:41:57.074 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37378.log: /var/log/ceph/ceph-client.admin.46435.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37378.log.gz
2026-03-31T20:41:57.074 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46435.log.gz
2026-03-31T20:41:57.074 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54617.log
2026-03-31T20:41:57.074 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27793.log
2026-03-31T20:41:57.075 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62842.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62842.log.gz
2026-03-31T20:41:57.075 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54617.log.gz
2026-03-31T20:41:57.075 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin2.32764.log
2026-03-31T20:41:57.075 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36079.log
2026-03-31T20:41:57.075 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27793.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27793.log.gz
2026-03-31T20:41:57.075 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin2.32764.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin2.32764.log.gz
2026-03-31T20:41:57.076 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54713.log
2026-03-31T20:41:57.076 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66825.log
2026-03-31T20:41:57.076 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36079.log.gz
2026-03-31T20:41:57.076 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54713.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54713.log.gz
2026-03-31T20:41:57.076 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52737.log
2026-03-31T20:41:57.076 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66825.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66825.log.gz
2026-03-31T20:41:57.077 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37609.log
2026-03-31T20:41:57.077 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44663.log
2026-03-31T20:41:57.077 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52737.log.gz
2026-03-31T20:41:57.077 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37609.log.gz
2026-03-31T20:41:57.077 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47025.log
2026-03-31T20:41:57.077 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41456.log
2026-03-31T20:41:57.078 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44663.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44663.log.gz
2026-03-31T20:41:57.078 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47025.log.gz
2026-03-31T20:41:57.078 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55052.log
2026-03-31T20:41:57.078 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26457.log
2026-03-31T20:41:57.078 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41456.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41456.log.gz
2026-03-31T20:41:57.078 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55052.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55052.log.gz
2026-03-31T20:41:57.078 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40203.log
2026-03-31T20:41:57.079 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34804.log
2026-03-31T20:41:57.079 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26457.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26457.log.gz
2026-03-31T20:41:57.079 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40203.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40203.log.gz
2026-03-31T20:41:57.079 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46485.log
2026-03-31T20:41:57.079 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37426.log
2026-03-31T20:41:57.079 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34804.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34804.log.gz
2026-03-31T20:41:57.079 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46485.log.gz
2026-03-31T20:41:57.079 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30661.log
2026-03-31T20:41:57.080 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49629.log
2026-03-31T20:41:57.080 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37426.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37426.log.gz
2026-03-31T20:41:57.080 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30661.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30661.log.gz
2026-03-31T20:41:57.080 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51470.log
2026-03-31T20:41:57.080 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42400.log
2026-03-31T20:41:57.080 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49629.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49629.log.gz
2026-03-31T20:41:57.081 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51470.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51470.log.gz
2026-03-31T20:41:57.081 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44855.log
2026-03-31T20:41:57.081 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27557.log
2026-03-31T20:41:57.081 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42400.log: /var/log/ceph/ceph-client.admin.44855.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42400.log.gz
2026-03-31T20:41:57.081 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44855.log.gz
2026-03-31T20:41:57.081 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38812.log
2026-03-31T20:41:57.082 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70547.log
2026-03-31T20:41:57.082 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27557.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27557.log.gz
2026-03-31T20:41:57.082 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38812.log.gz
2026-03-31T20:41:57.082 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42223.log
2026-03-31T20:41:57.082 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55989.log
2026-03-31T20:41:57.082 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70547.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70547.log.gz
2026-03-31T20:41:57.082 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42223.log.gz
2026-03-31T20:41:57.082 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69222.log
2026-03-31T20:41:57.083 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30637.log
2026-03-31T20:41:57.083 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55989.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55989.log.gz
2026-03-31T20:41:57.083 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69222.log.gz
2026-03-31T20:41:57.083 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63171.log
2026-03-31T20:41:57.083 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60303.log
2026-03-31T20:41:57.083 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30637.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30637.log.gz
2026-03-31T20:41:57.084 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63171.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63171.log.gz
2026-03-31T20:41:57.084 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70982.log
2026-03-31T20:41:57.084 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44518.log
2026-03-31T20:41:57.084 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60303.log: /var/log/ceph/ceph-client.admin.70982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60303.log.gz
2026-03-31T20:41:57.084 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70982.log.gz
2026-03-31T20:41:57.084 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55317.log
2026-03-31T20:41:57.084 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48630.log
2026-03-31T20:41:57.085 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44518.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44518.log.gz
2026-03-31T20:41:57.085 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55317.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55317.log.gz
2026-03-31T20:41:57.085 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47558.log
2026-03-31T20:41:57.085 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43860.log
2026-03-31T20:41:57.085 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48630.log.gz
2026-03-31T20:41:57.085 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47558.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47558.log.gz
2026-03-31T20:41:57.085 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51338.log
2026-03-31T20:41:57.086 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70621.log
2026-03-31T20:41:57.086 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43860.log: /var/log/ceph/ceph-client.admin.51338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43860.log.gz
2026-03-31T20:41:57.086 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51338.log.gz
2026-03-31T20:41:57.086 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45010.log
2026-03-31T20:41:57.086 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45945.log
2026-03-31T20:41:57.086 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70621.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70621.log.gz
2026-03-31T20:41:57.086 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45010.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45010.log.gz
2026-03-31T20:41:57.087 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.23025.log
2026-03-31T20:41:57.087 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67993.log
2026-03-31T20:41:57.087 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45945.log: /var/log/ceph/ceph-client.admin.23025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45945.log.gz
2026-03-31T20:41:57.087 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.23025.log.gz
2026-03-31T20:41:57.087 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58228.log
2026-03-31T20:41:57.087 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55390.log
2026-03-31T20:41:57.088 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67993.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67993.log.gz
2026-03-31T20:41:57.088 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58228.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58228.log.gz
2026-03-31T20:41:57.088 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36250.log
2026-03-31T20:41:57.088 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44542.log
2026-03-31T20:41:57.088 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55390.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55390.log.gz
2026-03-31T20:41:57.088 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36250.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36250.log.gz
2026-03-31T20:41:57.088 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38182.log
2026-03-31T20:41:57.089 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58345.log
2026-03-31T20:41:57.089 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44542.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44542.log.gz
2026-03-31T20:41:57.089 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38182.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38182.log.gz
2026-03-31T20:41:57.089 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78029.log
2026-03-31T20:41:57.089 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77701.log
2026-03-31T20:41:57.089 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58345.log.gz
2026-03-31T20:41:57.089 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78029.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78029.log.gz
2026-03-31T20:41:57.089 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55483.log
2026-03-31T20:41:57.090 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70375.log
2026-03-31T20:41:57.090 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77701.log: /var/log/ceph/ceph-client.admin.55483.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77701.log.gz
2026-03-31T20:41:57.090 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55483.log.gz
2026-03-31T20:41:57.090 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49151.log
2026-03-31T20:41:57.090 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48196.log
2026-03-31T20:41:57.090 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70375.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70375.log.gz
2026-03-31T20:41:57.091 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49151.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49151.log.gz
2026-03-31T20:41:57.091 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29571.log
2026-03-31T20:41:57.091 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.bug.64439.log
2026-03-31T20:41:57.091 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48196.log.gz
2026-03-31T20:41:57.091 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29571.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29571.log.gz
2026-03-31T20:41:57.091 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29788.log
2026-03-31T20:41:57.091 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77751.log
2026-03-31T20:41:57.092 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29788.log: /var/log/ceph/ceph-client.bug.64439.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29788.log.gz
2026-03-31T20:41:57.092 INFO:teuthology.orchestra.run.vm03.stderr: 66.7% -- replaced with /var/log/ceph/ceph-client.bug.64439.log.gz
2026-03-31T20:41:57.092 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32652.log
2026-03-31T20:41:57.092 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56665.log
2026-03-31T20:41:57.093 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77751.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77751.log.gz
2026-03-31T20:41:57.093 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32652.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32652.log.gz
2026-03-31T20:41:57.093 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68535.log
2026-03-31T20:41:57.093 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40407.log
2026-03-31T20:41:57.093 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56665.log.gz
2026-03-31T20:41:57.093 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68535.log.gz
2026-03-31T20:41:57.093 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48459.log
2026-03-31T20:41:57.094 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34212.log
2026-03-31T20:41:57.094 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40407.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40407.log.gz
2026-03-31T20:41:57.094 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48459.log.gz
2026-03-31T20:41:57.094 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47313.log
2026-03-31T20:41:57.094 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29104.log
2026-03-31T20:41:57.094 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34212.log.gz
2026-03-31T20:41:57.094 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47313.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47313.log.gz
2026-03-31T20:41:57.095 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29275.log
2026-03-31T20:41:57.095 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41944.log
2026-03-31T20:41:57.095 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29104.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29104.log.gz
2026-03-31T20:41:57.095 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29275.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29275.log.gz
2026-03-31T20:41:57.095 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57503.log
2026-03-31T20:41:57.095 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61908.log
2026-03-31T20:41:57.096 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41944.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41944.log.gz
2026-03-31T20:41:57.096 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57503.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57503.log.gz
2026-03-31T20:41:57.096 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29545.log
2026-03-31T20:41:57.096 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61828.log
2026-03-31T20:41:57.096 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61908.log: /var/log/ceph/ceph-client.admin.29545.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61908.log.gz
2026-03-31T20:41:57.096 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29545.log.gz
2026-03-31T20:41:57.096 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50763.log
2026-03-31T20:41:57.097 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48533.log
2026-03-31T20:41:57.097 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61828.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61828.log.gz
2026-03-31T20:41:57.097 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50763.log.gz
2026-03-31T20:41:57.097 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28264.log
2026-03-31T20:41:57.097 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46281.log
2026-03-31T20:41:57.097 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48533.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48533.log.gz
2026-03-31T20:41:57.097 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28264.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28264.log.gz
2026-03-31T20:41:57.098 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43066.log
2026-03-31T20:41:57.098 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57958.log
2026-03-31T20:41:57.098 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46281.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46281.log.gz
2026-03-31T20:41:57.098 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43066.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43066.log.gz
2026-03-31T20:41:57.098 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52514.log
2026-03-31T20:41:57.098 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63237.log
2026-03-31T20:41:57.098 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57958.log.gz
2026-03-31T20:41:57.099 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52514.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52514.log.gz
2026-03-31T20:41:57.099 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49870.log
2026-03-31T20:41:57.099 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55655.log
2026-03-31T20:41:57.099 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63237.log: /var/log/ceph/ceph-client.admin.49870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63237.log.gz
2026-03-31T20:41:57.099 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49870.log.gz
2026-03-31T20:41:57.099 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71698.log
2026-03-31T20:41:57.100 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51024.log
2026-03-31T20:41:57.100 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55655.log.gz
2026-03-31T20:41:57.100 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71698.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71698.log.gz
2026-03-31T20:41:57.100 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39188.log
2026-03-31T20:41:57.100 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37946.log
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51024.log.gz 2026-03-31T20:41:57.100 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39188.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39188.log.gz 2026-03-31T20:41:57.101 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30009.log 2026-03-31T20:41:57.101 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31534.log 2026-03-31T20:41:57.101 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37946.log: /var/log/ceph/ceph-client.admin.30009.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37946.log.gz 2026-03-31T20:41:57.101 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30009.log.gz 2026-03-31T20:41:57.101 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59825.log 2026-03-31T20:41:57.101 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37848.log 2026-03-31T20:41:57.101 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31534.log.gz 2026-03-31T20:41:57.102 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59825.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59825.log.gz 2026-03-31T20:41:57.102 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33839.log 2026-03-31T20:41:57.102 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63149.log 2026-03-31T20:41:57.102 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37848.log.gz 2026-03-31T20:41:57.102 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.33839.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33839.log.gz 2026-03-31T20:41:57.102 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78543.log 2026-03-31T20:41:57.102 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62224.log 2026-03-31T20:41:57.103 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63149.log.gz 2026-03-31T20:41:57.103 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78543.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78543.log.gz 2026-03-31T20:41:57.103 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55918.log 2026-03-31T20:41:57.103 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66072.log 2026-03-31T20:41:57.103 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62224.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62224.log.gz 2026-03-31T20:41:57.103 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55918.log.gz 2026-03-31T20:41:57.103 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37474.log 2026-03-31T20:41:57.104 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66072.log.gz 2026-03-31T20:41:57.104 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44967.log 2026-03-31T20:41:57.104 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33508.log 2026-03-31T20:41:57.104 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37474.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37474.log.gz 2026-03-31T20:41:57.104 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44967.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44967.log.gz 2026-03-31T20:41:57.104 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37450.log 2026-03-31T20:41:57.105 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42020.log 2026-03-31T20:41:57.105 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33508.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33508.log.gz 2026-03-31T20:41:57.105 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37450.log.gz 2026-03-31T20:41:57.105 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26605.log 2026-03-31T20:41:57.105 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32910.log 2026-03-31T20:41:57.105 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42020.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42020.log.gz 2026-03-31T20:41:57.106 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26605.log.gz 2026-03-31T20:41:57.106 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64743.log 2026-03-31T20:41:57.106 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28124.log 2026-03-31T20:41:57.106 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32910.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32910.log.gz 2026-03-31T20:41:57.106 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64743.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64743.log.gz 2026-03-31T20:41:57.106 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36128.log 2026-03-31T20:41:57.106 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31459.log 2026-03-31T20:41:57.107 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28124.log.gz 2026-03-31T20:41:57.107 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36128.log.gz 2026-03-31T20:41:57.107 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55723.log 2026-03-31T20:41:57.107 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55940.log 2026-03-31T20:41:57.107 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31459.log.gz 2026-03-31T20:41:57.107 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55723.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55723.log.gz 2026-03-31T20:41:57.107 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28525.log 2026-03-31T20:41:57.108 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31968.log 2026-03-31T20:41:57.108 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55940.log: /var/log/ceph/ceph-client.admin.28525.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55940.log.gz 2026-03-31T20:41:57.108 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28525.log.gz 2026-03-31T20:41:57.108 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41661.log 2026-03-31T20:41:57.108 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51312.log 2026-03-31T20:41:57.109 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31968.log.gz 2026-03-31T20:41:57.109 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41661.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41661.log.gz 2026-03-31T20:41:57.109 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42199.log 2026-03-31T20:41:57.109 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31942.log 2026-03-31T20:41:57.109 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51312.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51312.log.gz 2026-03-31T20:41:57.109 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42199.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42199.log.gz 2026-03-31T20:41:57.109 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78364.log 2026-03-31T20:41:57.110 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26166.log 2026-03-31T20:41:57.110 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31942.log: /var/log/ceph/ceph-client.admin.78364.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31942.log.gz 2026-03-31T20:41:57.110 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78364.log.gz 2026-03-31T20:41:57.110 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40329.log 2026-03-31T20:41:57.110 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65448.log 2026-03-31T20:41:57.110 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26166.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26166.log.gz 2026-03-31T20:41:57.110 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40329.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40329.log.gz 2026-03-31T20:41:57.111 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41127.log 2026-03-31T20:41:57.111 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38394.log 2026-03-31T20:41:57.111 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65448.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65448.log.gz 2026-03-31T20:41:57.111 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41127.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41127.log.gz 2026-03-31T20:41:57.111 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49199.log 2026-03-31T20:41:57.111 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47073.log 2026-03-31T20:41:57.111 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38394.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38394.log.gz 2026-03-31T20:41:57.112 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49199.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49199.log.gz 2026-03-31T20:41:57.112 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37735.log 2026-03-31T20:41:57.112 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56228.log 2026-03-31T20:41:57.112 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47073.log: /var/log/ceph/ceph-client.admin.37735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47073.log.gz 2026-03-31T20:41:57.112 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37735.log.gz 2026-03-31T20:41:57.112 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32606.log 2026-03-31T20:41:57.112 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67427.log 2026-03-31T20:41:57.113 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56228.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56228.log.gz 2026-03-31T20:41:57.113 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32606.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32606.log.gz 2026-03-31T20:41:57.113 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57162.log 2026-03-31T20:41:57.113 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36350.log 2026-03-31T20:41:57.113 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67427.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67427.log.gz 2026-03-31T20:41:57.113 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57162.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57162.log.gz 2026-03-31T20:41:57.113 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66718.log 2026-03-31T20:41:57.114 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64984.log 2026-03-31T20:41:57.114 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36350.log: /var/log/ceph/ceph-client.admin.66718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36350.log.gz 2026-03-31T20:41:57.114 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66718.log.gz 2026-03-31T20:41:57.114 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45355.log 2026-03-31T20:41:57.114 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26408.log 2026-03-31T20:41:57.114 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64984.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64984.log.gz 2026-03-31T20:41:57.115 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45355.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45355.log.gz 2026-03-31T20:41:57.115 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27443.log 2026-03-31T20:41:57.115 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59874.log 2026-03-31T20:41:57.115 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26408.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26408.log.gz 2026-03-31T20:41:57.115 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27443.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27443.log.gz 2026-03-31T20:41:57.115 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58091.log 2026-03-31T20:41:57.115 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31763.log 2026-03-31T20:41:57.116 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59874.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59874.log.gz 2026-03-31T20:41:57.116 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58091.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64621.log 2026-03-31T20:41:57.116 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58091.log.gz 2026-03-31T20:41:57.116 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31763.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31763.log.gz 2026-03-31T20:41:57.116 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32257.log 2026-03-31T20:41:57.117 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38934.log 2026-03-31T20:41:57.117 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64621.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64621.log.gz 2026-03-31T20:41:57.117 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32257.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32257.log.gz 2026-03-31T20:41:57.117 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59390.log 2026-03-31T20:41:57.117 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60780.log 2026-03-31T20:41:57.117 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38934.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38934.log.gz 2026-03-31T20:41:57.117 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59390.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59390.log.gz 2026-03-31T20:41:57.118 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49677.log 2026-03-31T20:41:57.118 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.57311.log 2026-03-31T20:41:57.118 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60780.log: /var/log/ceph/ceph-client.admin.49677.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60780.log.gz 2026-03-31T20:41:57.118 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49677.log.gz 2026-03-31T20:41:57.118 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39333.log 2026-03-31T20:41:57.118 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62019.log 2026-03-31T20:41:57.118 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57311.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57311.log.gz 2026-03-31T20:41:57.119 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39333.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39333.log.gz 2026-03-31T20:41:57.119 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22989.log 2026-03-31T20:41:57.119 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54292.log 2026-03-31T20:41:57.119 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62019.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62019.log.gz 2026-03-31T20:41:57.119 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.22989.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22989.log.gz 2026-03-31T20:41:57.119 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46579.log 2026-03-31T20:41:57.120 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46579.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46579.log.gz 2026-03-31T20:41:57.120 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55175.log 2026-03-31T20:41:57.120 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54292.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54292.log.gz 2026-03-31T20:41:57.120 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26041.log 2026-03-31T20:41:57.120 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55175.log.gz 2026-03-31T20:41:57.120 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56087.log 2026-03-31T20:41:57.121 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26041.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26041.log.gz 2026-03-31T20:41:57.121 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27675.log 2026-03-31T20:41:57.121 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57761.log 2026-03-31T20:41:57.121 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56087.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56087.log.gz 2026-03-31T20:41:57.121 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27675.log.gz 2026-03-31T20:41:57.121 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- 
/var/log/ceph/ceph-client.admin.26801.log 2026-03-31T20:41:57.122 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52201.log 2026-03-31T20:41:57.122 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57761.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57761.log.gz 2026-03-31T20:41:57.122 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26801.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26801.log.gz 2026-03-31T20:41:57.122 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46090.log 2026-03-31T20:41:57.122 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48384.log 2026-03-31T20:41:57.122 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52201.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52201.log.gz 2026-03-31T20:41:57.122 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46090.log.gz 2026-03-31T20:41:57.123 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68244.log 2026-03-31T20:41:57.123 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67163.log 2026-03-31T20:41:57.123 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48384.log: /var/log/ceph/ceph-client.admin.68244.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48384.log.gz 2026-03-31T20:41:57.123 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68244.log.gz 2026-03-31T20:41:57.123 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57184.log 2026-03-31T20:41:57.123 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33814.log 2026-03-31T20:41:57.123 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67163.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67163.log.gz 2026-03-31T20:41:57.124 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68788.log 2026-03-31T20:41:57.124 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57184.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57184.log.gz 2026-03-31T20:41:57.124 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.33814.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33814.log.gz 2026-03-31T20:41:57.124 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60663.log 2026-03-31T20:41:57.124 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39260.log 2026-03-31T20:41:57.125 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68788.log.gz 2026-03-31T20:41:57.125 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60663.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60663.log.gz 2026-03-31T20:41:57.125 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54886.log 2026-03-31T20:41:57.125 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41867.log 2026-03-31T20:41:57.125 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39260.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39260.log.gz 2026-03-31T20:41:57.125 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54886.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54886.log.gz 2026-03-31T20:41:57.125 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42604.log 2026-03-31T20:41:57.126 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44759.log 2026-03-31T20:41:57.126 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41867.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41867.log.gz 2026-03-31T20:41:57.126 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42604.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42604.log.gz 2026-03-31T20:41:57.126 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35534.log 2026-03-31T20:41:57.126 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57686.log 2026-03-31T20:41:57.126 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44759.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44759.log.gz 2026-03-31T20:41:57.126 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35534.log.gz 2026-03-31T20:41:57.127 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68271.log 2026-03-31T20:41:57.127 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27836.log 2026-03-31T20:41:57.127 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57686.log: /var/log/ceph/ceph-client.admin.68271.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57686.log.gz 2026-03-31T20:41:57.127 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68271.log.gz 2026-03-31T20:41:57.127 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53440.log 2026-03-31T20:41:57.127 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70883.log 2026-03-31T20:41:57.128 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27836.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27836.log.gz 2026-03-31T20:41:57.128 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53440.log.gz 2026-03-31T20:41:57.128 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35195.log 2026-03-31T20:41:57.128 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63421.log 2026-03-31T20:41:57.128 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70883.log: /var/log/ceph/ceph-client.admin.35195.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70883.log.gz 2026-03-31T20:41:57.128 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35195.log.gz 2026-03-31T20:41:57.128 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43141.log 2026-03-31T20:41:57.128 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67943.log 2026-03-31T20:41:57.129 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63421.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63421.log.gz 2026-03-31T20:41:57.129 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43141.log.gz 2026-03-31T20:41:57.129 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32431.log 2026-03-31T20:41:57.129 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29227.log 2026-03-31T20:41:57.129 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67943.log.gz 2026-03-31T20:41:57.129 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32431.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32431.log.gz 2026-03-31T20:41:57.129 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53977.log 2026-03-31T20:41:57.130 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47751.log 2026-03-31T20:41:57.130 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29227.log: /var/log/ceph/ceph-client.admin.53977.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29227.log.gz 2026-03-31T20:41:57.130 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53977.log.gz 2026-03-31T20:41:57.130 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43836.log 2026-03-31T20:41:57.130 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67918.log 2026-03-31T20:41:57.130 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47751.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47751.log.gz 2026-03-31T20:41:57.131 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43836.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43836.log.gz 2026-03-31T20:41:57.131 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52175.log 2026-03-31T20:41:57.131 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57337.log 2026-03-31T20:41:57.131 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67918.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67918.log.gz 2026-03-31T20:41:57.131 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52175.log.gz 2026-03-31T20:41:57.131 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27999.log 2026-03-31T20:41:57.131 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70908.log 2026-03-31T20:41:57.132 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27999.log.gz 2026-03-31T20:41:57.132 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57337.log.gz 2026-03-31T20:41:57.132 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41252.log 2026-03-31T20:41:57.132 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42348.log 2026-03-31T20:41:57.132 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70908.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70908.log.gz 2026-03-31T20:41:57.133 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41252.log.gz 2026-03-31T20:41:57.133 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46310.log 2026-03-31T20:41:57.133 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29326.log 2026-03-31T20:41:57.133 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42348.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42348.log.gz 2026-03-31T20:41:57.133 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46310.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46310.log.gz 2026-03-31T20:41:57.133 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41968.log 2026-03-31T20:41:57.133 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54979.log 2026-03-31T20:41:57.134 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29326.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29326.log.gz 2026-03-31T20:41:57.134 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41968.log.gz 2026-03-31T20:41:57.134 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40562.log 2026-03-31T20:41:57.134 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68115.log 2026-03-31T20:41:57.134 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54979.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54979.log.gz 2026-03-31T20:41:57.134 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40562.log.gz 2026-03-31T20:41:57.134 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78591.log 2026-03-31T20:41:57.135 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38206.log 2026-03-31T20:41:57.135 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68115.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68115.log.gz 2026-03-31T20:41:57.135 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78591.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78591.log.gz 2026-03-31T20:41:57.135 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32580.log 2026-03-31T20:41:57.135 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54784.log 2026-03-31T20:41:57.135 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38206.log.gz 2026-03-31T20:41:57.135 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32580.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32580.log.gz 2026-03-31T20:41:57.136 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70715.log 2026-03-31T20:41:57.136 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69585.log 2026-03-31T20:41:57.136 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54784.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54784.log.gz 2026-03-31T20:41:57.136 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70715.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70715.log.gz 2026-03-31T20:41:57.136 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68906.log 2026-03-31T20:41:57.136 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27243.log 2026-03-31T20:41:57.137 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69585.log.gz 2026-03-31T20:41:57.137 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68906.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68906.log.gz 2026-03-31T20:41:57.137 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25802.log 2026-03-31T20:41:57.137 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34487.log 2026-03-31T20:41:57.137 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27243.log.gz 2026-03-31T20:41:57.137 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25802.log.gz 2026-03-31T20:41:57.137 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41328.log 2026-03-31T20:41:57.138 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42298.log 2026-03-31T20:41:57.138 INFO:teuthology.orchestra.run.vm03.stderr: 91.8%/var/log/ceph/ceph-client.admin.34487.log: -- replaced with /var/log/ceph/ceph-mgr.x.log.gz 2026-03-31T20:41:57.138 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34487.log.gz 2026-03-31T20:41:57.138 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41328.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41328.log.gz 2026-03-31T20:41:57.138 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37090.log 2026-03-31T20:41:57.138 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46138.log 2026-03-31T20:41:57.138 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42298.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42298.log.gz 2026-03-31T20:41:57.138 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35709.log 2026-03-31T20:41:57.139 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37090.log.gz 2026-03-31T20:41:57.139 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46138.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63992.log 2026-03-31T20:41:57.139 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46138.log.gz 2026-03-31T20:41:57.139 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48434.log 2026-03-31T20:41:57.139 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35709.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35709.log.gz 2026-03-31T20:41:57.139 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40973.log 2026-03-31T20:41:57.139 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63992.log.gz 2026-03-31T20:41:57.139 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48434.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29935.log 2026-03-31T20:41:57.139 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48434.log.gz 2026-03-31T20:41:57.139 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35007.log 2026-03-31T20:41:57.140 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40973.log.gz 2026-03-31T20:41:57.140 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29935.log: 0.0%gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53083.log 2026-03-31T20:41:57.140 INFO:teuthology.orchestra.run.vm03.stderr: -- replaced with /var/log/ceph/ceph-client.admin.29935.log.gz 2026-03-31T20:41:57.140 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35007.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66350.log 2026-03-31T20:41:57.140 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35007.log.gz 2026-03-31T20:41:57.140 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53083.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53083.log.gz 2026-03-31T20:41:57.141 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66692.log 2026-03-31T20:41:57.141 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68629.log 2026-03-31T20:41:57.141 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66350.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66350.log.gz 2026-03-31T20:41:57.141 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53313.log 2026-03-31T20:41:57.141 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66692.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66692.log.gz 2026-03-31T20:41:57.141 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68629.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33029.log 2026-03-31T20:41:57.141 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68629.log.gz 2026-03-31T20:41:57.141 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55821.log 2026-03-31T20:41:57.142 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53313.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53313.log.gz 2026-03-31T20:41:57.142 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.33029.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53806.log 2026-03-31T20:41:57.142 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33029.log.gz 2026-03-31T20:41:57.142 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55821.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47339.log 2026-03-31T20:41:57.142 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55821.log.gz 2026-03-31T20:41:57.142 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53806.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53806.log.gz 2026-03-31T20:41:57.142 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40869.log 2026-03-31T20:41:57.143 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-mon.c.log 2026-03-31T20:41:57.143 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47339.log.gz 2026-03-31T20:41:57.143 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63796.log 2026-03-31T20:41:57.143 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40869.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40869.log.gz 2026-03-31T20:41:57.143 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-mon.c.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60924.log 2026-03-31T20:41:57.143 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35509.log 2026-03-31T20:41:57.143 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63796.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63796.log.gz 2026-03-31T20:41:57.144 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60924.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60924.log.gz 2026-03-31T20:41:57.144 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43191.log 2026-03-31T20:41:57.144 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35509.log.gz 2026-03-31T20:41:57.144 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58742.log 2026-03-31T20:41:57.144 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53219.log 2026-03-31T20:41:57.145 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43191.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43191.log.gz 2026-03-31T20:41:57.145 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58742.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58742.log.gz 2026-03-31T20:41:57.145 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71524.log 2026-03-31T20:41:57.145 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39528.log 2026-03-31T20:41:57.145 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53219.log.gz 2026-03-31T20:41:57.145 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71524.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71524.log.gz 2026-03-31T20:41:57.145 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54001.log 2026-03-31T20:41:57.146 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39528.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39528.log.gz 2026-03-31T20:41:57.146 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70066.log 2026-03-31T20:41:57.146 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27771.log 2026-03-31T20:41:57.146 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54001.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54001.log.gz 2026-03-31T20:41:57.146 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70066.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70066.log.gz 2026-03-31T20:41:57.146 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38641.log 2026-03-31T20:41:57.147 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27771.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27771.log.gz 2026-03-31T20:41:57.147 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59196.log 2026-03-31T20:41:57.147 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77800.log 2026-03-31T20:41:57.147 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38641.log.gz 2026-03-31T20:41:57.148 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59196.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59196.log.gz 2026-03-31T20:41:57.148 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31339.log 2026-03-31T20:41:57.148 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31484.log 2026-03-31T20:41:57.148 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77800.log.gz 2026-03-31T20:41:57.148 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31339.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31339.log.gz 2026-03-31T20:41:57.148 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39284.log 2026-03-31T20:41:57.149 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31484.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31484.log.gz 2026-03-31T20:41:57.149 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59801.log 2026-03-31T20:41:57.149 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30207.log 2026-03-31T20:41:57.149 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39284.log.gz 2026-03-31T20:41:57.149 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59801.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59801.log.gz 2026-03-31T20:41:57.150 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65332.log 2026-03-31T20:41:57.150 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63077.log 2026-03-31T20:41:57.150 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30207.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30207.log.gz 2026-03-31T20:41:57.150 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65332.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65332.log.gz 2026-03-31T20:41:57.150 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48704.log 2026-03-31T20:41:57.150 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52968.log 2026-03-31T20:41:57.150 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63077.log.gz 2026-03-31T20:41:57.151 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48704.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48704.log.gz 2026-03-31T20:41:57.151 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65857.log 2026-03-31T20:41:57.151 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44297.log 2026-03-31T20:41:57.151 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52968.log.gz 2026-03-31T20:41:57.151 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65857.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65857.log.gz 2026-03-31T20:41:57.151 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58202.log 2026-03-31T20:41:57.152 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44297.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44297.log.gz 2026-03-31T20:41:57.152 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35733.log 2026-03-31T20:41:57.152 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39603.log 2026-03-31T20:41:57.152 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58202.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58202.log.gz 2026-03-31T20:41:57.152 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35733.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35733.log.gz 2026-03-31T20:41:57.152 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54691.log 2026-03-31T20:41:57.153 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39603.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39603.log.gz 2026-03-31T20:41:57.153 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34016.log 2026-03-31T20:41:57.153 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66849.log 2026-03-31T20:41:57.153 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54691.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54691.log.gz 2026-03-31T20:41:57.153 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34016.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34016.log.gz 2026-03-31T20:41:57.153 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39552.log 2026-03-31T20:41:57.154 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66849.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66849.log.gz 2026-03-31T20:41:57.154 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34042.log 2026-03-31T20:41:57.154 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52436.log 2026-03-31T20:41:57.154 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39552.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39552.log.gz 2026-03-31T20:41:57.155 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34042.log.gz 2026-03-31T20:41:57.155 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39030.log 2026-03-31T20:41:57.155 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34536.log 2026-03-31T20:41:57.155 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52436.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52436.log.gz 2026-03-31T20:41:57.155 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39030.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39030.log.gz 2026-03-31T20:41:57.155 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71447.log 2026-03-31T20:41:57.155 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rw.33386.log 2026-03-31T20:41:57.156 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34536.log: /var/log/ceph/ceph-client.admin.71447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71447.log.gz 2026-03-31T20:41:57.156 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34536.log.gz 2026-03-31T20:41:57.156 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54399.log 2026-03-31T20:41:57.156 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53657.log 2026-03-31T20:41:57.156 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rw.33386.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rw.33386.log.gz 2026-03-31T20:41:57.157 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54399.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54399.log.gz 2026-03-31T20:41:57.157 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40999.log 2026-03-31T20:41:57.157 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49846.log 2026-03-31T20:41:57.157 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53657.log.gz 
2026-03-31T20:41:57.157 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40999.log.gz 2026-03-31T20:41:57.157 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61883.log 2026-03-31T20:41:57.158 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49846.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49846.log.gz 2026-03-31T20:41:57.158 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38416.log 2026-03-31T20:41:57.158 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51939.log 2026-03-31T20:41:57.158 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61883.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61883.log.gz 2026-03-31T20:41:57.158 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38416.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38416.log.gz 2026-03-31T20:41:57.158 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38044.log 2026-03-31T20:41:57.159 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51939.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68719.log 2026-03-31T20:41:57.159 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51939.log.gz 2026-03-31T20:41:57.159 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38044.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38044.log.gzgzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31049.log 2026-03-31T20:41:57.159 INFO:teuthology.orchestra.run.vm03.stderr: 2026-03-31T20:41:57.159 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68719.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68719.log.gz 2026-03-31T20:41:57.160 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61993.log 2026-03-31T20:41:57.160 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47098.log 2026-03-31T20:41:57.160 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31049.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31049.log.gz 2026-03-31T20:41:57.160 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61993.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61993.log.gz 2026-03-31T20:41:57.160 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70165.log 2026-03-31T20:41:57.160 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58474.log 2026-03-31T20:41:57.160 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47098.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47098.log.gz 2026-03-31T20:41:57.161 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70165.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70165.log.gz 2026-03-31T20:41:57.161 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46887.log 2026-03-31T20:41:57.161 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57473.log 2026-03-31T20:41:57.161 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58474.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58474.log.gz 2026-03-31T20:41:57.161 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46887.log.gz 2026-03-31T20:41:57.161 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37567.log 2026-03-31T20:41:57.161 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40742.log 2026-03-31T20:41:57.162 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57473.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57473.log.gz 2026-03-31T20:41:57.162 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37567.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37567.log.gz 2026-03-31T20:41:57.162 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56111.log 2026-03-31T20:41:57.162 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28927.log 2026-03-31T20:41:57.162 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40742.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40742.log.gz 2026-03-31T20:41:57.162 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56111.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56111.log.gz 2026-03-31T20:41:57.162 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28146.log 2026-03-31T20:41:57.163 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28927.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28927.log.gz 2026-03-31T20:41:57.163 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30561.log 2026-03-31T20:41:57.163 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph.log 2026-03-31T20:41:57.163 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28146.log.gz 2026-03-31T20:41:57.164 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30561.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30561.log.gz 2026-03-31T20:41:57.164 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56616.log 2026-03-31T20:41:57.164 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78883.log 2026-03-31T20:41:57.164 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56616.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56616.log.gz 2026-03-31T20:41:57.168 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph.log: 94.5% -- replaced with /var/log/ceph/ceph.log.gz 2026-03-31T20:41:57.168 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28078.log 2026-03-31T20:41:57.169 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53266.log 2026-03-31T20:41:57.169 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78883.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78883.log.gz 2026-03-31T20:41:57.169 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28078.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28078.log.gz 2026-03-31T20:41:57.169 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59898.log 2026-03-31T20:41:57.169 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46410.log 2026-03-31T20:41:57.169 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53266.log.gz 2026-03-31T20:41:57.170 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59898.log.gz 2026-03-31T20:41:57.170 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43955.log 2026-03-31T20:41:57.170 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46410.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46410.log.gz 2026-03-31T20:41:57.170 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55964.log 2026-03-31T20:41:57.170 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61247.log 2026-03-31T20:41:57.170 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43955.log.gz 2026-03-31T20:41:57.171 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55964.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55964.log.gz 2026-03-31T20:41:57.171 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36054.log 2026-03-31T20:41:57.171 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61247.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61247.log.gz 2026-03-31T20:41:57.171 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29858.log 2026-03-31T20:41:57.171 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41532.log 2026-03-31T20:41:57.172 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36054.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36054.log.gz 2026-03-31T20:41:57.172 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29858.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29858.log.gz 2026-03-31T20:41:57.172 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78102.log 2026-03-31T20:41:57.172 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32862.log 2026-03-31T20:41:57.172 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41532.log.gz 2026-03-31T20:41:57.172 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78102.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78102.log.gz 2026-03-31T20:41:57.172 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59220.log 2026-03-31T20:41:57.173 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27490.log 2026-03-31T20:41:57.173 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32862.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32862.log.gz 2026-03-31T20:41:57.173 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59220.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59220.log.gz 2026-03-31T20:41:57.173 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44711.log 2026-03-31T20:41:57.173 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44591.log 2026-03-31T20:41:57.173 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27490.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27490.log.gz 2026-03-31T20:41:57.173 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44711.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44711.log.gz 2026-03-31T20:41:57.174 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.0.log 2026-03-31T20:41:57.174 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44591.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44591.log.gz 2026-03-31T20:41:57.174 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65124.log 2026-03-31T20:41:57.174 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69144.log 2026-03-31T20:41:57.175 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-osd.0.log: 2026-03-31T20:41:57.175 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65124.log.gz 2026-03-31T20:41:57.188 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61422.log 2026-03-31T20:41:57.188 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69144.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69144.log.gz 2026-03-31T20:41:57.196 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71330.log 2026-03-31T20:41:57.196 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61422.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61422.log.gz 2026-03-31T20:41:57.196 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39976.log 2026-03-31T20:41:57.197 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71330.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71330.log.gz 2026-03-31T20:41:57.197 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38541.log 2026-03-31T20:41:57.197 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39976.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39976.log.gz 2026-03-31T20:41:57.197 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69021.log 2026-03-31T20:41:57.198 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38541.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38541.log.gz 2026-03-31T20:41:57.198 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64352.log 2026-03-31T20:41:57.198 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69021.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69021.log.gz 2026-03-31T20:41:57.198 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49295.log 2026-03-31T20:41:57.199 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64352.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64352.log.gz 2026-03-31T20:41:57.199 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35361.log 2026-03-31T20:41:57.199 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49295.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49295.log.gz 2026-03-31T20:41:57.200 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60205.log 2026-03-31T20:41:57.200 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35361.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35361.log.gz 2026-03-31T20:41:57.200 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44393.log 2026-03-31T20:41:57.200 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60205.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60205.log.gz 2026-03-31T20:41:57.201 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50066.log 2026-03-31T20:41:57.201 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44393.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44393.log.gz 2026-03-31T20:41:57.201 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36103.log 2026-03-31T20:41:57.202 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50066.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50066.log.gz 2026-03-31T20:41:57.202 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52850.log 2026-03-31T20:41:57.202 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36103.log.gz 2026-03-31T20:41:57.202 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54861.log 2026-03-31T20:41:57.203 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52850.log.gz 2026-03-31T20:41:57.203 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78450.log 2026-03-31T20:41:57.203 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54861.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54861.log.gz 2026-03-31T20:41:57.203 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rw.33266.log 2026-03-31T20:41:57.204 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78450.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78450.log.gz 2026-03-31T20:41:57.204 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65378.log 2026-03-31T20:41:57.204 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rw.33266.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rw.33266.log.gz 2026-03-31T20:41:57.204 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34852.log 2026-03-31T20:41:57.205 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65378.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65378.log.gz 2026-03-31T20:41:57.205 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51574.log 2026-03-31T20:41:57.205 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34852.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34852.log.gz 2026-03-31T20:41:57.206 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33005.log 2026-03-31T20:41:57.206 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51574.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51574.log.gz 2026-03-31T20:41:57.206 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63701.log 2026-03-31T20:41:57.206 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.33005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33005.log.gz 2026-03-31T20:41:57.207 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26284.log 2026-03-31T20:41:57.207 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63701.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63701.log.gz 2026-03-31T20:41:57.207 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37330.log 2026-03-31T20:41:57.208 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26284.log.gz 2026-03-31T20:41:57.208 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32308.log 2026-03-31T20:41:57.208 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37330.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37330.log.gz 2026-03-31T20:41:57.208 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43413.log 2026-03-31T20:41:57.209 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32308.log.gz 2026-03-31T20:41:57.209 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46935.log 2026-03-31T20:41:57.209 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43413.log.gz 2026-03-31T20:41:57.209 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27888.log 2026-03-31T20:41:57.210 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46935.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46935.log.gz 2026-03-31T20:41:57.210 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61297.log 2026-03-31T20:41:57.210 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27888.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27888.log.gz 2026-03-31T20:41:57.210 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67029.log 2026-03-31T20:41:57.211 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61297.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61297.log.gz 2026-03-31T20:41:57.211 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45331.log 2026-03-31T20:41:57.211 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67029.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67029.log.gz 2026-03-31T20:41:57.212 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60734.log 2026-03-31T20:41:57.212 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45331.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45331.log.gz 2026-03-31T20:41:57.212 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55366.log 2026-03-31T20:41:57.212 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60734.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60734.log.gz 2026-03-31T20:41:57.213 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50685.log 2026-03-31T20:41:57.213 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55366.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55366.log.gz 2026-03-31T20:41:57.213 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57443.log 2026-03-31T20:41:57.213 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50685.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50685.log.gz 2026-03-31T20:41:57.214 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35811.log 2026-03-31T20:41:57.214 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57443.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57443.log.gz 2026-03-31T20:41:57.214 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38861.log 2026-03-31T20:41:57.215 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35811.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35811.log.gz 2026-03-31T20:41:57.215 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51861.log 2026-03-31T20:41:57.215 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38861.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38861.log.gz 2026-03-31T20:41:57.215 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45130.log 2026-03-31T20:41:57.216 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51861.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51861.log.gz 2026-03-31T20:41:57.216 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52540.log 2026-03-31T20:41:57.216 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45130.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45130.log.gz 2026-03-31T20:41:57.217 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31229.log 2026-03-31T20:41:57.217 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52540.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52540.log.gz 2026-03-31T20:41:57.217 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63009.log 2026-03-31T20:41:57.217 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31229.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31229.log.gz 2026-03-31T20:41:57.218 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56062.log 2026-03-31T20:41:57.218 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63009.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63009.log.gz 2026-03-31T20:41:57.218 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71160.log 2026-03-31T20:41:57.218 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56062.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56062.log.gz 2026-03-31T20:41:57.219 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56446.log 2026-03-31T20:41:57.219 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71160.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71160.log.gz 2026-03-31T20:41:57.219 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30258.log 2026-03-31T20:41:57.220 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56446.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56446.log.gz 2026-03-31T20:41:57.220 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31738.log 2026-03-31T20:41:57.220 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30258.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30258.log.gz 2026-03-31T20:41:57.220 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45429.log 2026-03-31T20:41:57.221 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31738.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31738.log.gz 2026-03-31T20:41:57.221 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51182.log 2026-03-31T20:41:57.221 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45429.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45429.log.gz 2026-03-31T20:41:57.221 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65424.log 2026-03-31T20:41:57.222 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51182.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51182.log.gz 2026-03-31T20:41:57.222 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47243.log 2026-03-31T20:41:57.222 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65424.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65424.log.gz 2026-03-31T20:41:57.222 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52565.log 2026-03-31T20:41:57.223 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47243.log.gz 2026-03-31T20:41:57.223 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52944.log 2026-03-31T20:41:57.223 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52565.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52565.log.gz 2026-03-31T20:41:57.224 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32282.log 2026-03-31T20:41:57.224 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52944.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52944.log.gz 2026-03-31T20:41:57.224 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64273.log 2026-03-31T20:41:57.225 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32282.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32282.log.gz 2026-03-31T20:41:57.225 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63772.log 2026-03-31T20:41:57.225 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64273.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64273.log.gz 2026-03-31T20:41:57.225 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47655.log 2026-03-31T20:41:57.226 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63772.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63772.log.gz 2026-03-31T20:41:57.226 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56204.log 2026-03-31T20:41:57.226 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47655.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47655.log.gz 2026-03-31T20:41:57.226 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67746.log 2026-03-31T20:41:57.227 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56204.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56204.log.gz 2026-03-31T20:41:57.227 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30109.log 2026-03-31T20:41:57.227 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67746.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67746.log.gz 2026-03-31T20:41:57.227 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65287.log 2026-03-31T20:41:57.228 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30109.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30109.log.gz 2026-03-31T20:41:57.228 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31510.log 2026-03-31T20:41:57.228 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65287.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65287.log.gz 2026-03-31T20:41:57.229 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61597.log 2026-03-31T20:41:57.229 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31510.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31510.log.gz 2026-03-31T20:41:57.229 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26703.log 2026-03-31T20:41:57.229 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61597.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61597.log.gz 2026-03-31T20:41:57.230 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61802.log 2026-03-31T20:41:57.230 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26703.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26703.log.gz 2026-03-31T20:41:57.230 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd2.33790.log 2026-03-31T20:41:57.231 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61802.log.gz 2026-03-31T20:41:57.231 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32813.log 2026-03-31T20:41:57.231 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd2.33790.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd2.33790.log.gz 2026-03-31T20:41:57.231 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70933.log 2026-03-31T20:41:57.232 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32813.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32813.log.gz 2026-03-31T20:41:57.232 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28711.log 2026-03-31T20:41:57.232 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70933.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70933.log.gz 2026-03-31T20:41:57.233 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31028.log 2026-03-31T20:41:57.233 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28711.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28711.log.gz 2026-03-31T20:41:57.233 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78128.log 2026-03-31T20:41:57.233 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31028.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31028.log.gz 2026-03-31T20:41:57.234 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59968.log 2026-03-31T20:41:57.234 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78128.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78128.log.gz 2026-03-31T20:41:57.234 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35075.log 2026-03-31T20:41:57.234 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59968.log.gz 2026-03-31T20:41:57.235 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49175.log 2026-03-31T20:41:57.235 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35075.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35075.log.gz 2026-03-31T20:41:57.235 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78567.log 2026-03-31T20:41:57.236 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49175.log.gz 2026-03-31T20:41:57.236 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65907.log 2026-03-31T20:41:57.236 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78567.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78567.log.gz 
2026-03-31T20:41:57.236 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57607.log 2026-03-31T20:41:57.237 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65907.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65907.log.gz 2026-03-31T20:41:57.237 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50501.log 2026-03-31T20:41:57.237 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57607.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57607.log.gz 2026-03-31T20:41:57.237 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55581.log 2026-03-31T20:41:57.238 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50501.log.gz 2026-03-31T20:41:57.238 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44807.log 2026-03-31T20:41:57.238 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55581.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55581.log.gz 2026-03-31T20:41:57.238 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42044.log 2026-03-31T20:41:57.239 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44807.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44807.log.gz 2026-03-31T20:41:57.239 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60254.log 2026-03-31T20:41:57.239 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42044.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42044.log.gz 2026-03-31T20:41:57.240 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68164.log 2026-03-31T20:41:57.240 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60254.log.gz 2026-03-31T20:41:57.240 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67672.log 2026-03-31T20:41:57.241 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68164.log.gz 2026-03-31T20:41:57.241 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41278.log 2026-03-31T20:41:57.241 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67672.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67672.log.gz 2026-03-31T20:41:57.241 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53011.log 2026-03-31T20:41:57.242 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41278.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41278.log.gz 2026-03-31T20:41:57.242 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63565.log 2026-03-31T20:41:57.242 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53011.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53011.log.gz 2026-03-31T20:41:57.242 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57528.log 2026-03-31T20:41:57.243 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63565.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63565.log.gz 2026-03-31T20:41:57.243 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41354.log 2026-03-31T20:41:57.243 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57528.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57528.log.gz 2026-03-31T20:41:57.243 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36920.log 2026-03-31T20:41:57.244 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41354.log.gz 2026-03-31T20:41:57.244 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71354.log 2026-03-31T20:41:57.244 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36920.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36920.log.gz 2026-03-31T20:41:57.245 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49437.log 2026-03-31T20:41:57.245 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71354.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71354.log.gz 2026-03-31T20:41:57.245 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37693.log 2026-03-31T20:41:57.245 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49437.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49437.log.gz 2026-03-31T20:41:57.246 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65175.log 2026-03-31T20:41:57.246 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37693.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37693.log.gz 2026-03-31T20:41:57.246 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67545.log 2026-03-31T20:41:57.246 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65175.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65175.log.gz 2026-03-31T20:41:57.247 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64858.log 2026-03-31T20:41:57.247 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67545.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67545.log.gz 2026-03-31T20:41:57.247 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30731.log 2026-03-31T20:41:57.248 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64858.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64858.log.gz 2026-03-31T20:41:57.248 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66875.log 2026-03-31T20:41:57.248 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30731.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30731.log.gz 2026-03-31T20:41:57.248 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66587.log 2026-03-31T20:41:57.249 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66875.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66875.log.gz 2026-03-31T20:41:57.249 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52149.log 2026-03-31T20:41:57.249 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66587.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66587.log.gz 2026-03-31T20:41:57.249 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60545.log 2026-03-31T20:41:57.250 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52149.log.gz 2026-03-31T20:41:57.250 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39750.log 2026-03-31T20:41:57.250 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60545.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60545.log.gz 2026-03-31T20:41:57.251 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-ro.33195.log 2026-03-31T20:41:57.251 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39750.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39750.log.gz 2026-03-31T20:41:57.251 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63609.log 2026-03-31T20:41:57.251 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-ro.33195.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-ro.33195.log.gz 2026-03-31T20:41:57.252 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68191.log 2026-03-31T20:41:57.252 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63609.log.gz 2026-03-31T20:41:57.252 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61397.log 2026-03-31T20:41:57.253 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68191.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68191.log.gz 2026-03-31T20:41:57.253 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58910.log 2026-03-31T20:41:57.253 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61397.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61397.log.gz 2026-03-31T20:41:57.253 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45583.log 2026-03-31T20:41:57.254 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58910.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58910.log.gz 2026-03-31T20:41:57.254 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48484.log 2026-03-31T20:41:57.254 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45583.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45583.log.gz 2026-03-31T20:41:57.254 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36006.log 2026-03-31T20:41:57.255 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48484.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48484.log.gz 2026-03-31T20:41:57.255 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62752.log 2026-03-31T20:41:57.255 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36006.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36006.log.gz 2026-03-31T20:41:57.255 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57633.log 2026-03-31T20:41:57.256 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62752.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62752.log.gz 2026-03-31T20:41:57.256 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39140.log 2026-03-31T20:41:57.256 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57633.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57633.log.gz 2026-03-31T20:41:57.257 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41994.log 2026-03-31T20:41:57.257 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39140.log.gz 2026-03-31T20:41:57.257 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69874.log 2026-03-31T20:41:57.258 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41994.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41994.log.gz 2026-03-31T20:41:57.258 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44222.log 2026-03-31T20:41:57.258 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69874.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69874.log.gz 2026-03-31T20:41:57.258 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43572.log 2026-03-31T20:41:57.259 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44222.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44222.log.gz 2026-03-31T20:41:57.259 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44735.log 2026-03-31T20:41:57.259 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43572.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43572.log.gz 2026-03-31T20:41:57.259 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39504.log 2026-03-31T20:41:57.260 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44735.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44735.log.gz 2026-03-31T20:41:57.260 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68860.log 2026-03-31T20:41:57.260 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39504.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39504.log.gz 2026-03-31T20:41:57.260 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66902.log 2026-03-31T20:41:57.261 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68860.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68860.log.gz 2026-03-31T20:41:57.261 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33718.log 2026-03-31T20:41:57.261 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66902.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66902.log.gz 2026-03-31T20:41:57.262 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47534.log 2026-03-31T20:41:57.262 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33718.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33718.log.gz 2026-03-31T20:41:57.262 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32507.log 2026-03-31T20:41:57.262 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47534.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47534.log.gz 2026-03-31T20:41:57.263 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67056.log 2026-03-31T20:41:57.263 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32507.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32507.log.gz 2026-03-31T20:41:57.263 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37897.log 2026-03-31T20:41:57.264 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67056.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67056.log.gz 2026-03-31T20:41:57.264 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55630.log 2026-03-31T20:41:57.264 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37897.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37897.log.gz 2026-03-31T20:41:57.264 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39308.log 2026-03-31T20:41:57.265 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55630.log.gz 2026-03-31T20:41:57.265 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47631.log 2026-03-31T20:41:57.265 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39308.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39308.log.gz 2026-03-31T20:41:57.265 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28429.log 2026-03-31T20:41:57.266 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47631.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47631.log.gz 2026-03-31T20:41:57.266 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43596.log 2026-03-31T20:41:57.266 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28429.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28429.log.gz 2026-03-31T20:41:57.267 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41713.log 2026-03-31T20:41:57.267 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43596.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43596.log.gz 2026-03-31T20:41:57.267 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52899.log 2026-03-31T20:41:57.267 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41713.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41713.log.gz 2026-03-31T20:41:57.268 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68297.log 2026-03-31T20:41:57.268 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52899.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52899.log.gz 2026-03-31T20:41:57.268 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35146.log 2026-03-31T20:41:57.268 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68297.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68297.log.gz 2026-03-31T20:41:57.269 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60521.log 2026-03-31T20:41:57.269 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35146.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35146.log.gz 2026-03-31T20:41:57.269 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58321.log 2026-03-31T20:41:57.270 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60521.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60521.log.gz 2026-03-31T20:41:57.270 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49968.log 2026-03-31T20:41:57.270 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58321.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58321.log.gz 2026-03-31T20:41:57.270 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69925.log 2026-03-31T20:41:57.271 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49968.log.gz 2026-03-31T20:41:57.271 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30488.log 2026-03-31T20:41:57.271 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69925.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69925.log.gz 2026-03-31T20:41:57.271 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32457.log 2026-03-31T20:41:57.272 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30488.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30488.log.gz 2026-03-31T20:41:57.272 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50737.log 2026-03-31T20:41:57.272 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32457.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32457.log.gz 2026-03-31T20:41:57.273 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70211.log 2026-03-31T20:41:57.273 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50737.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50737.log.gz 2026-03-31T20:41:57.273 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46628.log 2026-03-31T20:41:57.273 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70211.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70211.log.gz 2026-03-31T20:41:57.274 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47896.log 2026-03-31T20:41:57.274 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46628.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46628.log.gz 2026-03-31T20:41:57.274 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78495.log 2026-03-31T20:41:57.275 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47896.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47896.log.gz 2026-03-31T20:41:57.275 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29031.log 2026-03-31T20:41:57.275 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78495.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78495.log.gz 2026-03-31T20:41:57.275 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52590.log 2026-03-31T20:41:57.276 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29031.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29031.log.gz 2026-03-31T20:41:57.276 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64452.log 2026-03-31T20:41:57.276 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52590.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52590.log.gz 2026-03-31T20:41:57.276 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67003.log 2026-03-31T20:41:57.277 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64452.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64452.log.gz 2026-03-31T20:41:57.277 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37651.log 2026-03-31T20:41:57.277 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67003.log.gz 2026-03-31T20:41:57.277 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47848.log 2026-03-31T20:41:57.278 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37651.log.gz 2026-03-31T20:41:57.278 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27914.log 2026-03-31T20:41:57.278 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47848.log.gz 2026-03-31T20:41:57.279 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58151.log 2026-03-31T20:41:57.279 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27914.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27914.log.gz 2026-03-31T20:41:57.279 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64095.log 2026-03-31T20:41:57.279 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58151.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58151.log.gz 2026-03-31T20:41:57.280 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37138.log 2026-03-31T20:41:57.280 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64095.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64095.log.gz 2026-03-31T20:41:57.280 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77727.log 2026-03-31T20:41:57.280 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37138.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37138.log.gz 2026-03-31T20:41:57.281 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28955.log 2026-03-31T20:41:57.281 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77727.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77727.log.gz 2026-03-31T20:41:57.281 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37016.log 2026-03-31T20:41:57.282 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28955.log.gz 2026-03-31T20:41:57.282 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78805.log 2026-03-31T20:41:57.282 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37016.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37016.log.gz 2026-03-31T20:41:57.282 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42374.log 2026-03-31T20:41:57.283 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78805.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78805.log.gz 2026-03-31T20:41:57.283 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54957.log 2026-03-31T20:41:57.283 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42374.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42374.log.gz 2026-03-31T20:41:57.283 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51835.log 2026-03-31T20:41:57.284 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54957.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54957.log.gz 2026-03-31T20:41:57.284 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39702.log 2026-03-31T20:41:57.284 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51835.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51835.log.gz 2026-03-31T20:41:57.285 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53129.log 2026-03-31T20:41:57.285 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39702.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39702.log.gz 2026-03-31T20:41:57.285 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31585.log 2026-03-31T20:41:57.285 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53129.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53129.log.gz 2026-03-31T20:41:57.286 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64124.log 2026-03-31T20:41:57.286 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31585.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31585.log.gz 2026-03-31T20:41:57.286 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58934.log 2026-03-31T20:41:57.287 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64124.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64124.log.gz 2026-03-31T20:41:57.287 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26679.log 2026-03-31T20:41:57.287 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58934.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58934.log.gz 2026-03-31T20:41:57.287 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56398.log 2026-03-31T20:41:57.288 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26679.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26679.log.gz 2026-03-31T20:41:57.288 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67110.log 2026-03-31T20:41:57.288 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56398.log.gz 2026-03-31T20:41:57.288 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53572.log 2026-03-31T20:41:57.289 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67110.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67110.log.gz 2026-03-31T20:41:57.289 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46463.log 2026-03-31T20:41:57.289 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53572.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53572.log.gz 2026-03-31T20:41:57.289 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78050.log 2026-03-31T20:41:57.290 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46463.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46463.log.gz 2026-03-31T20:41:57.290 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61647.log 2026-03-31T20:41:57.290 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78050.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78050.log.gz 2026-03-31T20:41:57.290 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45874.log 2026-03-31T20:41:57.291 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61647.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61647.log.gz 2026-03-31T20:41:57.291 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78831.log 2026-03-31T20:41:57.291 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45874.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45874.log.gz 2026-03-31T20:41:57.292 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66566.log 2026-03-31T20:41:57.292 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78831.log.gz 2026-03-31T20:41:57.292 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57554.log 2026-03-31T20:41:57.292 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66566.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66566.log.gz 2026-03-31T20:41:57.293 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49461.log 2026-03-31T20:41:57.293 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57554.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57554.log.gz 2026-03-31T20:41:57.293 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63398.log 2026-03-31T20:41:57.293 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49461.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49461.log.gz 2026-03-31T20:41:57.294 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30377.log 2026-03-31T20:41:57.294 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63398.log.gz 2026-03-31T20:41:57.294 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.1.log 2026-03-31T20:41:57.295 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30377.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30377.log.gz 2026-03-31T20:41:57.295 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42783.log 2026-03-31T20:41:57.312 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-osd.1.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32334.log 2026-03-31T20:41:57.312 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42783.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42783.log.gz 2026-03-31T20:41:57.316 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49367.log 2026-03-31T20:41:57.316 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32334.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32334.log.gz 2026-03-31T20:41:57.328 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59850.log 2026-03-31T20:41:57.328 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49367.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49367.log.gz 2026-03-31T20:41:57.328 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30038.log 2026-03-31T20:41:57.332 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59850.log.gz 2026-03-31T20:41:57.340 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37630.log 2026-03-31T20:41:57.340 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30038.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30038.log.gz 2026-03-31T20:41:57.344 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48863.log 2026-03-31T20:41:57.344 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37630.log.gz 2026-03-31T20:41:57.352 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43979.log 2026-03-31T20:41:57.352 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48863.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48863.log.gz 2026-03-31T20:41:57.356 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51809.log 2026-03-31T20:41:57.356 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43979.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43979.log.gz 2026-03-31T20:41:57.364 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29376.log 2026-03-31T20:41:57.364 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51809.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51809.log.gz 2026-03-31T20:41:57.372 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78935.log 2026-03-31T20:41:57.372 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29376.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29376.log.gz 2026-03-31T20:41:57.388 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65100.log 2026-03-31T20:41:57.388 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78935.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78935.log.gz 2026-03-31T20:41:57.391 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-mon.b.log: 91.3% -- replaced with /var/log/ceph/ceph-mon.b.log.gz 2026-03-31T20:41:57.391 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rw.33290.log 2026-03-31T20:41:57.391 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65100.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65100.log.gz 2026-03-31T20:41:57.392 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35485.log 2026-03-31T20:41:57.392 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rw.33290.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rw.33290.log.gz 2026-03-31T20:41:57.392 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61572.log 2026-03-31T20:41:57.392 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35485.log.gz 2026-03-31T20:41:57.392 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45800.log 2026-03-31T20:41:57.393 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61572.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61572.log.gz 2026-03-31T20:41:57.393 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40665.log 2026-03-31T20:41:57.393 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45800.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45800.log.gz 2026-03-31T20:41:57.393 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31097.log 2026-03-31T20:41:57.394 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40665.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40665.log.gz 2026-03-31T20:41:57.394 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68882.log 2026-03-31T20:41:57.394 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31097.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31097.log.gz 2026-03-31T20:41:57.394 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78753.log 2026-03-31T20:41:57.394 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68882.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68882.log.gz 2026-03-31T20:41:57.395 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31149.log 2026-03-31T20:41:57.395 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78753.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78753.log.gz 2026-03-31T20:41:57.395 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35956.log 2026-03-31T20:41:57.395 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31149.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31149.log.gz 2026-03-31T20:41:57.396 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45206.log 2026-03-31T20:41:57.396 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35956.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35956.log.gz 2026-03-31T20:41:57.396 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45675.log 2026-03-31T20:41:57.396 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45206.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45206.log.gz 2026-03-31T20:41:57.396 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33864.log 2026-03-31T20:41:57.397 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45675.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45675.log.gz 2026-03-31T20:41:57.397 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28617.log 2026-03-31T20:41:57.397 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.33864.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33864.log.gz 2026-03-31T20:41:57.397 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58610.log 2026-03-31T20:41:57.398 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28617.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28617.log.gz 2026-03-31T20:41:57.398 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41228.log 2026-03-31T20:41:57.398 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58610.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58610.log.gz 2026-03-31T20:41:57.398 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38352.log 2026-03-31T20:41:57.398 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41228.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41228.log.gz 2026-03-31T20:41:57.399 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48271.log 2026-03-31T20:41:57.399 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38352.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38352.log.gz 2026-03-31T20:41:57.399 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69778.log 2026-03-31T20:41:57.399 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48271.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48271.log.gz 2026-03-31T20:41:57.399 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57091.log 2026-03-31T20:41:57.400 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69778.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69778.log.gz 2026-03-31T20:41:57.400 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36322.log 2026-03-31T20:41:57.400 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57091.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57091.log.gz 2026-03-31T20:41:57.400 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51600.log 2026-03-31T20:41:57.401 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36322.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36322.log.gz 2026-03-31T20:41:57.401 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71080.log 2026-03-31T20:41:57.401 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51600.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51600.log.gz 2026-03-31T20:41:57.401 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49797.log 2026-03-31T20:41:57.401 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71080.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71080.log.gz 2026-03-31T20:41:57.402 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41840.log 2026-03-31T20:41:57.402 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49797.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49797.log.gz 2026-03-31T20:41:57.402 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56975.log 2026-03-31T20:41:57.402 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41840.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41840.log.gz 2026-03-31T20:41:57.403 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77983.log 2026-03-31T20:41:57.403 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56975.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56975.log.gz 2026-03-31T20:41:57.403 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69197.log 2026-03-31T20:41:57.403 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77983.log.gz 2026-03-31T20:41:57.403 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57857.log 2026-03-31T20:41:57.404 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69197.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69197.log.gz 2026-03-31T20:41:57.404 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60423.log 2026-03-31T20:41:57.404 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57857.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57857.log.gz 2026-03-31T20:41:57.404 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42630.log 2026-03-31T20:41:57.405 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60423.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60423.log.gz 2026-03-31T20:41:57.405 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63281.log 2026-03-31T20:41:57.405 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42630.log.gz 2026-03-31T20:41:57.405 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46555.log 2026-03-31T20:41:57.406 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63281.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63281.log.gz 2026-03-31T20:41:57.406 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-ro.33053.log 2026-03-31T20:41:57.406 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46555.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46555.log.gz 2026-03-31T20:41:57.406 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67369.log 2026-03-31T20:41:57.406 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-ro.33053.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-ro.33053.log.gz 2026-03-31T20:41:57.407 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50372.log 2026-03-31T20:41:57.407 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67369.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67369.log.gz 2026-03-31T20:41:57.407 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31865.log 2026-03-31T20:41:57.407 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50372.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50372.log.gz 2026-03-31T20:41:57.408 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48046.log 2026-03-31T20:41:57.408 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31865.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31865.log.gz 2026-03-31T20:41:57.408 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78429.log 2026-03-31T20:41:57.408 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48046.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48046.log.gz 2026-03-31T20:41:57.408 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59559.log 2026-03-31T20:41:57.409 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78429.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78429.log.gz 2026-03-31T20:41:57.409 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44901.log 2026-03-31T20:41:57.409 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59559.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59559.log.gz 2026-03-31T20:41:57.409 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61522.log 2026-03-31T20:41:57.410 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44901.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44901.log.gz 2026-03-31T20:41:57.410 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29446.log 2026-03-31T20:41:57.410 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61522.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61522.log.gz 2026-03-31T20:41:57.410 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67263.log 2026-03-31T20:41:57.410 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29446.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29446.log.gz 2026-03-31T20:41:57.411 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43815.log 2026-03-31T20:41:57.411 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67263.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67263.log.gz 2026-03-31T20:41:57.411 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51234.log 2026-03-31T20:41:57.411 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43815.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43815.log.gz 2026-03-31T20:41:57.411 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64383.log 2026-03-31T20:41:57.412 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51234.log.gz 2026-03-31T20:41:57.412 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60900.log 2026-03-31T20:41:57.412 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64383.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64383.log.gz 2026-03-31T20:41:57.412 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48582.log 2026-03-31T20:41:57.413 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60900.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60900.log.gz 2026-03-31T20:41:57.413 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32382.log 2026-03-31T20:41:57.413 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48582.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48582.log.gz 2026-03-31T20:41:57.413 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55508.log 2026-03-31T20:41:57.413 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32382.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32382.log.gz 2026-03-31T20:41:57.414 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30512.log 2026-03-31T20:41:57.414 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55508.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55508.log.gz 2026-03-31T20:41:57.414 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56374.log 2026-03-31T20:41:57.414 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30512.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30512.log.gz 2026-03-31T20:41:57.415 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68559.log 2026-03-31T20:41:57.415 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56374.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56374.log.gz 2026-03-31T20:41:57.415 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29641.log 2026-03-31T20:41:57.415 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68559.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68559.log.gz 2026-03-31T20:41:57.415 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70234.log 2026-03-31T20:41:57.416 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29641.log.gz 2026-03-31T20:41:57.416 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32137.log 2026-03-31T20:41:57.416 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70234.log.gz 2026-03-31T20:41:57.417 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55243.log 2026-03-31T20:41:57.417 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32137.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32137.log.gz 2026-03-31T20:41:57.417 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61447.log 2026-03-31T20:41:57.417 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55243.log.gz 2026-03-31T20:41:57.417 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42250.log 2026-03-31T20:41:57.418 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61447.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61447.log.gz 2026-03-31T20:41:57.418 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40050.log 2026-03-31T20:41:57.418 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42250.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42250.log.gz 2026-03-31T20:41:57.418 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66119.log 2026-03-31T20:41:57.418 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40050.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40050.log.gz 2026-03-31T20:41:57.419 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45504.log 2026-03-31T20:41:57.419 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66119.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66119.log.gz 2026-03-31T20:41:57.419 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39627.log 2026-03-31T20:41:57.419 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45504.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45504.log.gz 2026-03-31T20:41:57.420 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48680.log 2026-03-31T20:41:57.420 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39627.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39627.log.gz 2026-03-31T20:41:57.420 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45279.log 2026-03-31T20:41:57.420 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48680.log.gz 2026-03-31T20:41:57.420 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34630.log 2026-03-31T20:41:57.421 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45279.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45279.log.gz 2026-03-31T20:41:57.421 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69609.log 2026-03-31T20:41:57.421 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34630.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34630.log.gz 2026-03-31T20:41:57.421 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69683.log 2026-03-31T20:41:57.422 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69609.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69609.log.gz 2026-03-31T20:41:57.422 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32358.log 2026-03-31T20:41:57.422 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69683.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69683.log.gz 2026-03-31T20:41:57.422 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50633.log 2026-03-31T20:41:57.422 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32358.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32358.log.gz 2026-03-31T20:41:57.423 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60756.log 2026-03-31T20:41:57.423 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50633.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50633.log.gz 2026-03-31T20:41:57.423 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56519.log 2026-03-31T20:41:57.423 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60756.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60756.log.gz 2026-03-31T20:41:57.424 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26872.log 2026-03-31T20:41:57.424 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56519.log.gz 2026-03-31T20:41:57.424 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36776.log 2026-03-31T20:41:57.424 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26872.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26872.log.gz 2026-03-31T20:41:57.424 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52688.log 2026-03-31T20:41:57.425 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36776.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36776.log.gz 2026-03-31T20:41:57.425 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78727.log 2026-03-31T20:41:57.425 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52688.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52688.log.gz 2026-03-31T20:41:57.425 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46362.log 2026-03-31T20:41:57.425 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78727.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78727.log.gz 2026-03-31T20:41:57.426 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27169.log 2026-03-31T20:41:57.426 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46362.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46362.log.gz 2026-03-31T20:41:57.426 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71235.log 2026-03-31T20:41:57.426 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27169.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27169.log.gz 2026-03-31T20:41:57.427 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30980.log 2026-03-31T20:41:57.427 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71235.log: 39.5% -- replaced with /var/log/ceph/ceph-client.admin.71235.log.gz 2026-03-31T20:41:57.427 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56830.log 2026-03-31T20:41:57.427 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30980.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30980.log.gz 2026-03-31T20:41:57.428 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43438.log 2026-03-31T20:41:57.428 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56830.log.gz 2026-03-31T20:41:57.428 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62985.log 2026-03-31T20:41:57.428 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43438.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43438.log.gz 2026-03-31T20:41:57.428 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47580.log 2026-03-31T20:41:57.429 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62985.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62985.log.gz 2026-03-31T20:41:57.429 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44273.log 2026-03-31T20:41:57.429 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47580.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47580.log.gz 2026-03-31T20:41:57.429 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28898.log 2026-03-31T20:41:57.430 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44273.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44273.log.gz 2026-03-31T20:41:57.430 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70141.log 2026-03-31T20:41:57.430 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28898.log.gz 2026-03-31T20:41:57.430 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50139.log 2026-03-31T20:41:57.430 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70141.log.gz 2026-03-31T20:41:57.431 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55077.log 2026-03-31T20:41:57.431 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50139.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50139.log.gz 2026-03-31T20:41:57.431 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25731.log 2026-03-31T20:41:57.431 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55077.log.gz 2026-03-31T20:41:57.432 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65765.log 2026-03-31T20:41:57.432 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25731.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25731.log.gz 2026-03-31T20:41:57.432 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40843.log 2026-03-31T20:41:57.432 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65765.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65765.log.gz 2026-03-31T20:41:57.432 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45058.log 2026-03-31T20:41:57.433 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40843.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40843.log.gz 2026-03-31T20:41:57.433 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33990.log 2026-03-31T20:41:57.433 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45058.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45058.log.gz 2026-03-31T20:41:57.433 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44419.log 2026-03-31T20:41:57.433 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.33990.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33990.log.gz 2026-03-31T20:41:57.434 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56301.log 2026-03-31T20:41:57.434 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44419.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44419.log.gz 2026-03-31T20:41:57.434 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46727.log 2026-03-31T20:41:57.434 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56301.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56301.log.gz 2026-03-31T20:41:57.435 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40253.log 2026-03-31T20:41:57.435 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46727.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46727.log.gz 2026-03-31T20:41:57.435 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60830.log 2026-03-31T20:41:57.435 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40253.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40253.log.gz 2026-03-31T20:41:57.435 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69850.log 2026-03-31T20:41:57.436 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60830.log.gz 2026-03-31T20:41:57.436 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58834.log 2026-03-31T20:41:57.436 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69850.log.gz 2026-03-31T20:41:57.436 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31204.log 2026-03-31T20:41:57.436 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58834.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58834.log.gz 2026-03-31T20:41:57.437 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49749.log 2026-03-31T20:41:57.437 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31204.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31204.log.gz 2026-03-31T20:41:57.437 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26998.log 2026-03-31T20:41:57.437 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49749.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49749.log.gz 2026-03-31T20:41:57.438 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47968.log 2026-03-31T20:41:57.438 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26998.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26998.log.gz 2026-03-31T20:41:57.438 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56591.log 2026-03-31T20:41:57.438 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47968.log.gz 2026-03-31T20:41:57.438 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59293.log 2026-03-31T20:41:57.439 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56591.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56591.log.gz 2026-03-31T20:41:57.439 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47173.log 2026-03-31T20:41:57.439 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59293.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59293.log.gz 2026-03-31T20:41:57.439 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58812.log 2026-03-31T20:41:57.440 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47173.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47173.log.gz 2026-03-31T20:41:57.440 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43548.log 2026-03-31T20:41:57.440 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58812.log.gz 2026-03-31T20:41:57.440 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55028.log 2026-03-31T20:41:57.440 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43548.log.gz 2026-03-31T20:41:57.441 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55219.log 2026-03-31T20:41:57.441 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55028.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55028.log.gz 2026-03-31T20:41:57.441 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63679.log 2026-03-31T20:41:57.441 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55219.log.gz 2026-03-31T20:41:57.441 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34164.log 2026-03-31T20:41:57.442 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63679.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63679.log.gz 2026-03-31T20:41:57.442 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64329.log 2026-03-31T20:41:57.442 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34164.log.gz 2026-03-31T20:41:57.442 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46751.log 2026-03-31T20:41:57.443 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64329.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64329.log.gz 2026-03-31T20:41:57.443 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44249.log 2026-03-31T20:41:57.443 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46751.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46751.log.gz 2026-03-31T20:41:57.443 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54212.log 2026-03-31T20:41:57.443 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44249.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44249.log.gz 2026-03-31T20:41:57.444 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71185.log 2026-03-31T20:41:57.444 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54212.log.gz 2026-03-31T20:41:57.444 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50017.log 2026-03-31T20:41:57.445 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71185.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71185.log.gz 2026-03-31T20:41:57.445 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69658.log 2026-03-31T20:41:57.445 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50017.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50017.log.gz 2026-03-31T20:41:57.445 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63657.log 2026-03-31T20:41:57.446 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69658.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69658.log.gz 2026-03-31T20:41:57.446 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45898.log 2026-03-31T20:41:57.446 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63657.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63657.log.gz 2026-03-31T20:41:57.446 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26556.log 2026-03-31T20:41:57.446 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45898.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45898.log.gz 2026-03-31T20:41:57.447 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62818.log 2026-03-31T20:41:57.447 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26556.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26556.log.gz 2026-03-31T20:41:57.447 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27862.log 2026-03-31T20:41:57.447 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62818.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62818.log.gz 2026-03-31T20:41:57.447 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50424.log 2026-03-31T20:41:57.448 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27862.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27862.log.gz 2026-03-31T20:41:57.448 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56422.log 2026-03-31T20:41:57.448 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50424.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50424.log.gz 2026-03-31T20:41:57.448 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59583.log 2026-03-31T20:41:57.449 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56422.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56422.log.gz 2026-03-31T20:41:57.449 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43644.log 2026-03-31T20:41:57.449 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59583.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59583.log.gz 2026-03-31T20:41:57.449 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28571.log 2026-03-31T20:41:57.449 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43644.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43644.log.gz 2026-03-31T20:41:57.450 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55532.log 2026-03-31T20:41:57.450 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28571.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28571.log.gz 2026-03-31T20:41:57.450 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31714.log 2026-03-31T20:41:57.450 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55532.log.gz 2026-03-31T20:41:57.450 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37588.log 2026-03-31T20:41:57.451 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31714.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31714.log.gz 2026-03-31T20:41:57.451 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29179.log 2026-03-31T20:41:57.451 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37588.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37588.log.gz 2026-03-31T20:41:57.451 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62774.log 2026-03-31T20:41:57.452 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29179.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29179.log.gz 2026-03-31T20:41:57.452 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58253.log 2026-03-31T20:41:57.452 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62774.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62774.log.gz 2026-03-31T20:41:57.453 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43524.log 2026-03-31T20:41:57.453 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58253.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58253.log.gz 2026-03-31T20:41:57.453 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65931.log 2026-03-31T20:41:57.453 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43524.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43524.log.gz 2026-03-31T20:41:57.453 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49653.log 2026-03-31T20:41:57.454 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65931.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65931.log.gz 2026-03-31T20:41:57.454 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40817.log 2026-03-31T20:41:57.454 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49653.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49653.log.gz 2026-03-31T20:41:57.454 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45558.log 2026-03-31T20:41:57.455 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40817.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40817.log.gz 2026-03-31T20:41:57.455 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78208.log 2026-03-31T20:41:57.455 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45558.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45558.log.gz 2026-03-31T20:41:57.455 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65955.log 2026-03-31T20:41:57.455 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78208.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78208.log.gz 2026-03-31T20:41:57.456 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32788.log 2026-03-31T20:41:57.456 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65955.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65955.log.gz
2026-03-31T20:41:57.456 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43885.log
2026-03-31T20:41:57.456 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32788.log.gz
2026-03-31T20:41:57.456 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40101.log
2026-03-31T20:41:57.457 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43885.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43885.log.gz
2026-03-31T20:41:57.457 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46256.log
2026-03-31T20:41:57.457 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40101.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40101.log.gz
2026-03-31T20:41:57.457 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45605.log
2026-03-31T20:41:57.458 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46256.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46256.log.gz
2026-03-31T20:41:57.458 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71472.log
2026-03-31T20:41:57.458 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45605.log.gz
2026-03-31T20:41:57.458 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38837.log
2026-03-31T20:41:57.458 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71472.log.gz
2026-03-31T20:41:57.459 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46652.log
2026-03-31T20:41:57.459 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38837.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38837.log.gz
2026-03-31T20:41:57.459 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55003.log
2026-03-31T20:41:57.459 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46652.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46652.log.gz
2026-03-31T20:41:57.460 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66302.log
2026-03-31T20:41:57.460 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55003.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55003.log.gz
2026-03-31T20:41:57.460 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68487.log
2026-03-31T20:41:57.461 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66302.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66302.log.gz
2026-03-31T20:41:57.461 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45230.log
2026-03-31T20:41:57.461 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68487.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68487.log.gz
2026-03-31T20:41:57.461 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52252.log
2026-03-31T20:41:57.462 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45230.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45230.log.gz
2026-03-31T20:41:57.462 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56277.log
2026-03-31T20:41:57.462 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52252.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52252.log.gz
2026-03-31T20:41:57.462 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66979.log
2026-03-31T20:41:57.462 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56277.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56277.log.gz
2026-03-31T20:41:57.463 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47363.log
2026-03-31T20:41:57.463 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66979.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66979.log.gz
2026-03-31T20:41:57.463 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54568.log
2026-03-31T20:41:57.463 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47363.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47363.log.gz
2026-03-31T20:41:57.464 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61122.log
2026-03-31T20:41:57.464 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54568.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54568.log.gz
2026-03-31T20:41:57.464 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58983.log
2026-03-31T20:41:57.464 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61122.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61122.log.gz
2026-03-31T20:41:57.464 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31994.log
2026-03-31T20:41:57.465 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58983.log.gz
2026-03-31T20:41:57.465 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25828.log
2026-03-31T20:41:57.465 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31994.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31994.log.gz
2026-03-31T20:41:57.465 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78701.log
2026-03-31T20:41:57.465 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25828.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25828.log.gz
2026-03-31T20:41:57.466 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36944.log
2026-03-31T20:41:57.466 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78701.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78701.log.gz
2026-03-31T20:41:57.466 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62532.log
2026-03-31T20:41:57.466 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36944.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36944.log.gz
2026-03-31T20:41:57.467 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48172.log
2026-03-31T20:41:57.467 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62532.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62532.log.gz
2026-03-31T20:41:57.467 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67595.log
2026-03-31T20:41:57.467 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48172.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48172.log.gz
2026-03-31T20:41:57.467 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61022.log
2026-03-31T20:41:57.468 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67595.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67595.log.gz
2026-03-31T20:41:57.468 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60878.log
2026-03-31T20:41:57.468 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61022.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61022.log.gz
2026-03-31T20:41:57.468 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46067.log
2026-03-31T20:41:57.469 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60878.log.gz
2026-03-31T20:41:57.469 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25776.log
2026-03-31T20:41:57.469 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46067.log.gz
2026-03-31T20:41:57.469 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.23049.log
2026-03-31T20:41:57.470 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25776.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25776.log.gz
2026-03-31T20:41:57.470 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70739.log
2026-03-31T20:41:57.470 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.23049.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.23049.log.gz
2026-03-31T20:41:57.470 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49413.log
2026-03-31T20:41:57.470 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70739.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70739.log.gz
2026-03-31T20:41:57.471 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38515.log
2026-03-31T20:41:57.471 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49413.log.gz
2026-03-31T20:41:57.471 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36347.log
2026-03-31T20:41:57.471 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38515.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38515.log.gz
2026-03-31T20:41:57.472 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42733.log
2026-03-31T20:41:57.472 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36347.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36347.log.gz
2026-03-31T20:41:57.472 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49605.log
2026-03-31T20:41:57.472 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42733.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42733.log.gz
2026-03-31T20:41:57.472 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40177.log
2026-03-31T20:41:57.473 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49605.log.gz
2026-03-31T20:41:57.473 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32483.log
2026-03-31T20:41:57.473 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40177.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40177.log.gz
2026-03-31T20:41:57.473 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58422.log
2026-03-31T20:41:57.473 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32483.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32483.log.gz
2026-03-31T20:41:57.474 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30398.log
2026-03-31T20:41:57.474 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58422.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58422.log.gz
2026-03-31T20:41:57.474 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.22959.log
2026-03-31T20:41:57.474 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30398.log.gz
2026-03-31T20:41:57.475 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33578.log
2026-03-31T20:41:57.475 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.22959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.22959.log.gz
2026-03-31T20:41:57.475 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56951.log
2026-03-31T20:41:57.475 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33578.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33578.log.gz
2026-03-31T20:41:57.475 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36728.log
2026-03-31T20:41:57.476 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56951.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56951.log.gz
2026-03-31T20:41:57.476 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58299.log
2026-03-31T20:41:57.476 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36728.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36728.log.gz
2026-03-31T20:41:57.477 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78234.log
2026-03-31T20:41:57.477 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58299.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58299.log.gz
2026-03-31T20:41:57.477 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26433.log
2026-03-31T20:41:57.477 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78234.log.gz
2026-03-31T20:41:57.477 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56999.log
2026-03-31T20:41:57.478 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26433.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26433.log.gz
2026-03-31T20:41:57.478 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40613.log
2026-03-31T20:41:57.478 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56999.log.gz
2026-03-31T20:41:57.478 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49509.log
2026-03-31T20:41:57.478 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40613.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40613.log.gz
2026-03-31T20:41:57.479 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27193.log
2026-03-31T20:41:57.479 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49509.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49509.log.gz
2026-03-31T20:41:57.479 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63866.log
2026-03-31T20:41:57.479 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27193.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27193.log.gz
2026-03-31T20:41:57.480 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53928.log
2026-03-31T20:41:57.480 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63866.log.gz
2026-03-31T20:41:57.480 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55126.log
2026-03-31T20:41:57.480 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53928.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53928.log.gz
2026-03-31T20:41:57.480 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45850.log
2026-03-31T20:41:57.481 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55126.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55126.log.gz
2026-03-31T20:41:57.481 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34284.log
2026-03-31T20:41:57.481 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45850.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45850.log.gz
2026-03-31T20:41:57.481 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40793.log
2026-03-31T20:41:57.482 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34284.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34284.log.gz
2026-03-31T20:41:57.482 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61497.log
2026-03-31T20:41:57.482 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40793.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40793.log.gz
2026-03-31T20:41:57.482 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64242.log
2026-03-31T20:41:57.482 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61497.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61497.log.gz
2026-03-31T20:41:57.483 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41025.log
2026-03-31T20:41:57.483 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64242.log: 66.7% -- replaced with /var/log/ceph/ceph-client.admin.64242.log.gz
2026-03-31T20:41:57.483 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52071.log
2026-03-31T20:41:57.483 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41025.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41025.log.gz
2026-03-31T20:41:57.484 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50999.log
2026-03-31T20:41:57.484 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52071.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52071.log.gz
2026-03-31T20:41:57.484 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66024.log
2026-03-31T20:41:57.484 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50999.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50999.log.gz
2026-03-31T20:41:57.485 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36394.log
2026-03-31T20:41:57.485 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66024.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66024.log.gz
2026-03-31T20:41:57.485 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40716.log
2026-03-31T20:41:57.485 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36394.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36394.log.gz
2026-03-31T20:41:57.485 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67820.log
2026-03-31T20:41:57.486 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40716.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40716.log.gz
2026-03-31T20:41:57.486 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-ro.33125.log
2026-03-31T20:41:57.486 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67820.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67820.log.gz
2026-03-31T20:41:57.486 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53683.log
2026-03-31T20:41:57.487 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-ro.33125.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-ro.33125.log.gz
2026-03-31T20:41:57.487 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52785.log
2026-03-31T20:41:57.487 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53683.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53683.log.gz
2026-03-31T20:41:57.487 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42121.log
2026-03-31T20:41:57.487 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52785.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52785.log.gz
2026-03-31T20:41:57.488 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32066.log
2026-03-31T20:41:57.488 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42121.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42121.log.gz
2026-03-31T20:41:57.488 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70958.log
2026-03-31T20:41:57.488 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32066.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32066.log.gz
2026-03-31T20:41:57.489 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61097.log
2026-03-31T20:41:57.489 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70958.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70958.log.gz
2026-03-31T20:41:57.489 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55150.log
2026-03-31T20:41:57.489 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61097.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61097.log.gz
2026-03-31T20:41:57.490 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68605.log
2026-03-31T20:41:57.490 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55150.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55150.log.gz
2026-03-31T20:41:57.490 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27469.log
2026-03-31T20:41:57.490 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68605.log.gz
2026-03-31T20:41:57.491 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52830.log
2026-03-31T20:41:57.491 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27469.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27469.log.gz
2026-03-31T20:41:57.491 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62312.log
2026-03-31T20:41:57.491 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52830.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52830.log.gz
2026-03-31T20:41:57.491 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66141.log
2026-03-31T20:41:57.492 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62312.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62312.log.gz
2026-03-31T20:41:57.492 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39849.log
2026-03-31T20:41:57.492 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66141.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66141.log.gz
2026-03-31T20:41:57.492 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38591.log
2026-03-31T20:41:57.492 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39849.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39849.log.gz
2026-03-31T20:41:57.493 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46797.log
2026-03-31T20:41:57.493 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38591.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38591.log.gz
2026-03-31T20:41:57.493 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78290.log
2026-03-31T20:41:57.493 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46797.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46797.log.gz
2026-03-31T20:41:57.494 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69414.log
2026-03-31T20:41:57.494 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78290.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78290.log.gz
2026-03-31T20:41:57.494 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43014.log
2026-03-31T20:41:57.494 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69414.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69414.log.gz
2026-03-31T20:41:57.494 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28829.log
2026-03-31T20:41:57.495 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43014.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43014.log.gz
2026-03-31T20:41:57.495 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57783.log
2026-03-31T20:41:57.495 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28829.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28829.log.gz
2026-03-31T20:41:57.495 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37186.log
2026-03-31T20:41:57.496 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57783.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57783.log.gz
2026-03-31T20:41:57.496 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53241.log
2026-03-31T20:41:57.496 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37186.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37186.log.gz
2026-03-31T20:41:57.496 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25966.log
2026-03-31T20:41:57.497 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53241.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53241.log.gz
2026-03-31T20:41:57.497 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34140.log
2026-03-31T20:41:57.497 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25966.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25966.log.gz
2026-03-31T20:41:57.497 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66164.log
2026-03-31T20:41:57.497 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34140.log.gz
2026-03-31T20:41:57.498 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29834.log
2026-03-31T20:41:57.498 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66164.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66164.log.gz
2026-03-31T20:41:57.498 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45403.log
2026-03-31T20:41:57.498 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29834.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29834.log.gz
2026-03-31T20:41:57.499 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70304.log
2026-03-31T20:41:57.499 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45403.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45403.log.gz
2026-03-31T20:41:57.499 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34332.log
2026-03-31T20:41:57.499 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70304.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70304.log.gz
2026-03-31T20:41:57.499 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70400.log
2026-03-31T20:41:57.500 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34332.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34332.log.gz
2026-03-31T20:41:57.500 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45727.log
2026-03-31T20:41:57.500 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70400.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70400.log.gz
2026-03-31T20:41:57.500 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68812.log
2026-03-31T20:41:57.501 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45727.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45727.log.gz
2026-03-31T20:41:57.501 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39212.log
2026-03-31T20:41:57.501 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68812.log.gz
2026-03-31T20:41:57.501 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69561.log
2026-03-31T20:41:57.501 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39212.log.gz
2026-03-31T20:41:57.502 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49701.log
2026-03-31T20:41:57.502 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69561.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69561.log.gz
2026-03-31T20:41:57.502 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50091.log
2026-03-31T20:41:57.502 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49701.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49701.log.gz
2026-03-31T20:41:57.502 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37210.log
2026-03-31T20:41:57.503 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50091.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50091.log.gz
2026-03-31T20:41:57.503 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63033.log
2026-03-31T20:41:57.503 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37210.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37210.log.gz
2026-03-31T20:41:57.503 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68017.log
2026-03-31T20:41:57.504 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63033.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63033.log.gz
2026-03-31T20:41:57.504 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36442.log
2026-03-31T20:41:57.504 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68017.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68017.log.gz
2026-03-31T20:41:57.504 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46507.log
2026-03-31T20:41:57.505 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36442.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36442.log.gz
2026-03-31T20:41:57.505 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42862.log
2026-03-31T20:41:57.505 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46507.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46507.log.gz
2026-03-31T20:41:57.505 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37714.log
2026-03-31T20:41:57.506 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42862.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42862.log.gz
2026-03-31T20:41:57.506 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58883.log
2026-03-31T20:41:57.506 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37714.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37714.log.gz
2026-03-31T20:41:57.506 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47775.log
2026-03-31T20:41:57.506 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58883.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58883.log.gz
2026-03-31T20:41:57.507 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78649.log
2026-03-31T20:41:57.507 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47775.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47775.log.gz
2026-03-31T20:41:57.507 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56713.log
2026-03-31T20:41:57.507 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78649.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78649.log.gz
2026-03-31T20:41:57.507 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67844.log
2026-03-31T20:41:57.507 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56713.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56713.log.gz
2026-03-31T20:41:57.508 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49127.log
2026-03-31T20:41:57.508 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67844.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67844.log.gz
2026-03-31T20:41:57.508 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54837.log
2026-03-31T20:41:57.508 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49127.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49127.log.gz
2026-03-31T20:41:57.509 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33602.log
2026-03-31T20:41:57.509 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54837.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54837.log.gz
2026-03-31T20:41:57.509 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51625.log
2026-03-31T20:41:57.509 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33602.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33602.log.gz
2026-03-31T20:41:57.509 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47799.log
2026-03-31T20:41:57.510 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51625.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51625.log.gz
2026-03-31T20:41:57.510 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42939.log
2026-03-31T20:41:57.510 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47799.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47799.log.gz
2026-03-31T20:41:57.510 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35883.log
2026-03-31T20:41:57.510 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42939.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42939.log.gz
2026-03-31T20:41:57.511 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78857.log
2026-03-31T20:41:57.511 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35883.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35883.log.gz
2026-03-31T20:41:57.511 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41103.log
2026-03-31T20:41:57.511 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78857.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78857.log.gz
2026-03-31T20:41:57.512 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62620.log
2026-03-31T20:41:57.512 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41103.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41103.log.gz
2026-03-31T20:41:57.512 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31259.log
2026-03-31T20:41:57.512 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62620.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62620.log.gz
2026-03-31T20:41:57.512 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62963.log
2026-03-31T20:41:57.513 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31259.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31259.log.gz
2026-03-31T20:41:57.513 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57832.log
2026-03-31T20:41:57.513 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62963.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62963.log.gz
2026-03-31T20:41:57.513 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25706.log
2026-03-31T20:41:57.514 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57832.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57832.log.gz
2026-03-31T20:41:57.514 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30332.log
2026-03-31T20:41:57.514 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25706.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25706.log.gz
2026-03-31T20:41:57.514 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77776.log
2026-03-31T20:41:57.514 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30332.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30332.log.gz
2026-03-31T20:41:57.515 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36611.log
2026-03-31T20:41:57.515 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77776.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77776.log.gz
2026-03-31T20:41:57.515 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28212.log
2026-03-31T20:41:57.515 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36611.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36611.log.gz
2026-03-31T20:41:57.515 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36968.log
2026-03-31T20:41:57.516 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28212.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28212.log.gz
2026-03-31T20:41:57.516 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59031.log
2026-03-31T20:41:57.516 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36968.log.gz
2026-03-31T20:41:57.516 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62422.log
2026-03-31T20:41:57.517 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59031.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59031.log.gz
2026-03-31T20:41:57.517 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59055.log
2026-03-31T20:41:57.517 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62422.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62422.log.gz
2026-03-31T20:41:57.517 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37756.log
2026-03-31T20:41:57.517 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59055.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59055.log.gz
2026-03-31T20:41:57.518 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37064.log
2026-03-31T20:41:57.518 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37756.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37756.log.gz
2026-03-31T20:41:57.518 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61938.log
2026-03-31T20:41:57.518 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37064.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37064.log.gz
2026-03-31T20:41:57.519 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34935.log
2026-03-31T20:41:57.519 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61938.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61938.log.gz
2026-03-31T20:41:57.519 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61968.log
2026-03-31T20:41:57.519 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34935.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34935.log.gz
2026-03-31T20:41:57.519 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34983.log
2026-03-31T20:41:57.520 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61968.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61968.log.gz
2026-03-31T20:41:57.520 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41049.log
2026-03-31T20:41:57.520 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34983.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34983.log.gz
2026-03-31T20:41:57.521 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37672.log
2026-03-31T20:41:57.521 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41049.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41049.log.gz
2026-03-31T20:41:57.521 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71582.log
2026-03-31T20:41:57.521 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37672.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37672.log.gz
2026-03-31T20:41:57.521 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28054.log
2026-03-31T20:41:57.522 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71582.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71582.log.gz
2026-03-31T20:41:57.522 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35411.log
2026-03-31T20:41:57.522 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28054.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28054.log.gz
2026-03-31T20:41:57.522 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49894.log
2026-03-31T20:41:57.523 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35411.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35411.log.gz
2026-03-31T20:41:57.523 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42526.log
2026-03-31T20:41:57.523 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49894.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49894.log.gz
2026-03-31T20:41:57.523 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48221.log
2026-03-31T20:41:57.523 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42526.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42526.log.gz
2026-03-31T20:41:57.524 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55772.log
2026-03-31T20:41:57.524 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48221.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48221.log.gz
2026-03-31T20:41:57.524 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66798.log
2026-03-31T20:41:57.524 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55772.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55772.log.gz
2026-03-31T20:41:57.525 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69338.log
2026-03-31T20:41:57.525 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66798.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66798.log.gz
2026-03-31T20:41:57.525 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58691.log
2026-03-31T20:41:57.525 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69338.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69338.log.gz
2026-03-31T20:41:57.525 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26580.log
2026-03-31T20:41:57.526 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58691.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58691.log.gz
2026-03-31T20:41:57.526 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51286.log
2026-03-31T20:41:57.526 INFO:teuthology.orchestra.run.vm03.stderr: 91.4% -- replaced with /var/log/ceph/ceph-mon.c.log.gz
2026-03-31T20:41:57.526 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26580.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26580.log.gz
2026-03-31T20:41:57.526 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67398.log
2026-03-31T20:41:57.527 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51286.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51887.log
2026-03-31T20:41:57.527 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51286.log.gz
2026-03-31T20:41:57.527 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67398.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59993.log
2026-03-31T20:41:57.527 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67398.log.gz
2026-03-31T20:41:57.527 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51887.log.gz
2026-03-31T20:41:57.527 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27603.log
2026-03-31T20:41:57.527 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59993.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44027.log
2026-03-31T20:41:57.527 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59993.log.gz
2026-03-31T20:41:57.528 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27603.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67770.log
2026-03-31T20:41:57.528 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27603.log.gz
2026-03-31T20:41:57.528 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44027.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44027.log.gz
2026-03-31T20:41:57.528 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42274.log
2026-03-31T20:41:57.528 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67770.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70328.log
2026-03-31T20:41:57.528 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67770.log.gz
2026-03-31T20:41:57.529 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42274.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42274.log.gz
2026-03-31T20:41:57.529 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53287.log
2026-03-31T20:41:57.529 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70328.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55846.log
2026-03-31T20:41:57.529 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70328.log.gz
2026-03-31T20:41:57.529 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53287.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53287.log.gz
2026-03-31T20:41:57.529 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40689.log
2026-03-31T20:41:57.530 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55846.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59753.log
2026-03-31T20:41:57.530 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55846.log.gz
2026-03-31T20:41:57.530 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40689.log.gz
2026-03-31T20:41:57.530 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58277.log
2026-03-31T20:41:57.530 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59753.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71425.log
2026-03-31T20:41:57.530 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59753.log.gz
2026-03-31T20:41:57.530 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58277.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58277.log.gz
2026-03-31T20:41:57.530 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66374.log
2026-03-31T20:41:57.531 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71425.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59944.log
2026-03-31T20:41:57.531 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71425.log.gz
2026-03-31T20:41:57.531 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66374.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66374.log.gz
2026-03-31T20:41:57.531 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32981.log
2026-03-31T20:41:57.531 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59944.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27651.log
2026-03-31T20:41:57.531 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59944.log.gz
2026-03-31T20:41:57.531 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32981.log.gz
2026-03-31T20:41:57.531 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62158.log
2026-03-31T20:41:57.532 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27651.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27651.log.gz
2026-03-31T20:41:57.532 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78320.log
2026-03-31T20:41:57.532 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62158.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67239.log
2026-03-31T20:41:57.532 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62158.log.gz
2026-03-31T20:41:57.532 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78320.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78320.log.gz
2026-03-31T20:41:57.533 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49485.log
2026-03-31T20:41:57.533 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67239.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39236.log
2026-03-31T20:41:57.533 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67239.log.gz
2026-03-31T20:41:57.533 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49485.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49485.log.gz
2026-03-31T20:41:57.533 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54322.log
2026-03-31T20:41:57.533 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39236.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64828.log
2026-03-31T20:41:57.533 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39236.log.gz
2026-03-31T20:41:57.533 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54322.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54322.log.gz
2026-03-31T20:41:57.534 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58176.log
2026-03-31T20:41:57.534 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64828.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-ro.33101.log
2026-03-31T20:41:57.534 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64828.log.gz
2026-03-31T20:41:57.534 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58176.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58176.log.gz
2026-03-31T20:41:57.534 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52920.log
2026-03-31T20:41:57.534 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-ro.33101.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45305.log
2026-03-31T20:41:57.534 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-ro.33101.log.gz
2026-03-31T20:41:57.534 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52920.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52920.log.gz
2026-03-31T20:41:57.535 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31840.log
2026-03-31T20:41:57.535 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45305.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59704.log
2026-03-31T20:41:57.535 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45305.log.gz
2026-03-31T20:41:57.535 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31840.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31840.log.gz
2026-03-31T20:41:57.535 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47605.log
2026-03-31T20:41:57.535 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59704.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47874.log
2026-03-31T20:41:57.535 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59704.log.gz
2026-03-31T20:41:57.535 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47605.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47605.log.gz
2026-03-31T20:41:57.536 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44615.log
2026-03-31T20:41:57.536 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47874.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65806.log
2026-03-31T20:41:57.536 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47874.log.gz
2026-03-31T20:41:57.536 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44615.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44615.log.gz
2026-03-31T20:41:57.536 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.23234.log
2026-03-31T20:41:57.536 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65806.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28639.log
2026-03-31T20:41:57.536 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65806.log.gz
2026-03-31T20:41:57.536 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.23234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.23234.log.gz
2026-03-31T20:41:57.537 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77959.log
2026-03-31T20:41:57.537 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28639.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61347.log
2026-03-31T20:41:57.537 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28639.log.gz
2026-03-31T20:41:57.537 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77959.log.gz
2026-03-31T20:41:57.537 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38616.log
2026-03-31T20:41:57.537 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61347.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41204.log
2026-03-31T20:41:57.537 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61347.log.gz
2026-03-31T20:41:57.537 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38616.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38616.log.gz
2026-03-31T20:41:57.538 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69067.log
2026-03-31T20:41:57.538 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41204.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42578.log
2026-03-31T20:41:57.538 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41204.log.gz
2026-03-31T20:41:57.538 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69067.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69067.log.gz
2026-03-31T20:41:57.538 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48020.log
2026-03-31T20:41:57.538 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30684.log
2026-03-31T20:41:57.539 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42578.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42578.log.gz
2026-03-31T20:41:57.539 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48020.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48020.log.gz
2026-03-31T20:41:57.539 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56349.log
2026-03-31T20:41:57.539 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30684.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48336.log
2026-03-31T20:41:57.539 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30684.log.gz
2026-03-31T20:41:57.539 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56349.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56349.log.gz
2026-03-31T20:41:57.539 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78959.log
2026-03-31T20:41:57.540 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48336.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48336.log.gz
2026-03-31T20:41:57.540 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64415.log
2026-03-31T20:41:57.540 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78959.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44196.log
2026-03-31T20:41:57.540 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78959.log.gz
2026-03-31T20:41:57.540 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64415.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64415.log.gz
2026-03-31T20:41:57.541 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68345.log
2026-03-31T20:41:57.541 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44196.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70474.log
2026-03-31T20:41:57.541 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44196.log.gz
2026-03-31T20:41:57.541 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68345.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68345.log.gz
2026-03-31T20:41:57.541 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42502.log
2026-03-31T20:41:57.541 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70474.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68066.log
2026-03-31T20:41:57.541 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70474.log.gz
2026-03-31T20:41:57.542 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42502.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42502.log.gz
2026-03-31T20:41:57.542 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70498.log
2026-03-31T20:41:57.542 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68066.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47147.log
2026-03-31T20:41:57.542 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68066.log.gz
2026-03-31T20:41:57.542 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70498.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70498.log.gz -5
2026-03-31T20:41:57.542 INFO:teuthology.orchestra.run.vm03.stderr: --verbose -- /var/log/ceph/ceph-client.admin.70787.log
2026-03-31T20:41:57.543 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47147.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47147.log.gz
2026-03-31T20:41:57.543 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70667.log
2026-03-31T20:41:57.543 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70787.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26044.log
2026-03-31T20:41:57.543 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70787.log.gz
2026-03-31T20:41:57.543 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70667.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70667.log.gz
2026-03-31T20:41:57.543 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27699.log
2026-03-31T20:41:57.543 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26044.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39949.log
2026-03-31T20:41:57.544 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26044.log.gz
2026-03-31T20:41:57.544 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27699.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27699.log.gz
2026-03-31T20:41:57.544 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36848.log
2026-03-31T20:41:57.544 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39949.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45776.log
2026-03-31T20:41:57.544 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39949.log.gz
2026-03-31T20:41:57.544 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36848.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36848.log.gz
2026-03-31T20:41:57.544 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65007.log
2026-03-31T20:41:57.544 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45776.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38910.log
2026-03-31T20:41:57.545 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45776.log.gz
2026-03-31T20:41:57.545 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65007.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65007.log.gz
2026-03-31T20:41:57.545 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59438.log
2026-03-31T20:41:57.545 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38910.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68511.log
2026-03-31T20:41:57.545 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38910.log.gz
2026-03-31T20:41:57.545 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59438.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59438.log.gz
2026-03-31T20:41:57.545 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53337.log
2026-03-31T20:41:57.545 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68511.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36275.log
2026-03-31T20:41:57.545 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68511.log.gz
2026-03-31T20:41:57.546 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53337.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53337.log.gz
2026-03-31T20:41:57.546 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39382.log
2026-03-31T20:41:57.546 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36275.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49103.log
2026-03-31T20:41:57.546 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36275.log.gz
2026-03-31T20:41:57.546 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39382.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39382.log.gz
2026-03-31T20:41:57.546 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44923.log
2026-03-31T20:41:57.546 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49103.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31435.log
2026-03-31T20:41:57.546 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49103.log.gz
2026-03-31T20:41:57.547 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44923.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44923.log.gz
2026-03-31T20:41:57.547 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39406.log
2026-03-31T20:41:57.547 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31435.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35835.log
2026-03-31T20:41:57.547 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31435.log.gz
2026-03-31T20:41:57.547 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39406.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39406.log.gz
2026-03-31T20:41:57.547 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28375.log
2026-03-31T20:41:57.547 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35835.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53878.log
2026-03-31T20:41:57.547 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35835.log.gz
2026-03-31T20:41:57.548 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28375.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51651.log
2026-03-31T20:41:57.548 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28375.log.gz
2026-03-31T20:41:57.548 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53878.log.gz
2026-03-31T20:41:57.548 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40510.log
2026-03-31T20:41:57.548 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51651.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44493.log
2026-03-31T20:41:57.548 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51651.log.gz
2026-03-31T20:41:57.549 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40510.log: gzip 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40510.log.gz -5
2026-03-31T20:41:57.549 INFO:teuthology.orchestra.run.vm03.stderr: --verbose -- /var/log/ceph/ceph-client.admin.26752.log
2026-03-31T20:41:57.549 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44493.log.gz
2026-03-31T20:41:57.549 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29495.log
2026-03-31T20:41:57.549 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26752.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58959.log
2026-03-31T20:41:57.549 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26752.log.gz
2026-03-31T20:41:57.550 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29495.log: gzip 0.0% -5 -- replaced with /var/log/ceph/ceph-client.admin.29495.log.gz --verbose
2026-03-31T20:41:57.550 INFO:teuthology.orchestra.run.vm03.stderr: -- /var/log/ceph/ceph-client.admin.62866.log
2026-03-31T20:41:57.550 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58959.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58959.log.gz
2026-03-31T20:41:57.550
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34678.log 2026-03-31T20:41:57.550 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62866.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62866.log.gz 2026-03-31T20:41:57.550 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36824.log 2026-03-31T20:41:57.551 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40766.log 2026-03-31T20:41:57.551 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34678.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34678.log.gz 2026-03-31T20:41:57.551 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36824.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36824.log.gz 2026-03-31T20:41:57.551 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63517.log 2026-03-31T20:41:57.551 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40766.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52989.log 2026-03-31T20:41:57.551 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40766.log.gz 2026-03-31T20:41:57.552 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63517.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47920.log 2026-03-31T20:41:57.552 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63517.log.gz 2026-03-31T20:41:57.552 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52989.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52989.log.gz 2026-03-31T20:41:57.552 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67870.log 2026-03-31T20:41:57.552 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47920.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66187.log 2026-03-31T20:41:57.552 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47920.log.gz 2026-03-31T20:41:57.553 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67870.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67870.log.gzgzip 2026-03-31T20:41:57.553 INFO:teuthology.orchestra.run.vm03.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.37872.log 2026-03-31T20:41:57.553 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66187.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66187.log.gz 2026-03-31T20:41:57.553 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39651.log 2026-03-31T20:41:57.553 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37872.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50115.log 2026-03-31T20:41:57.553 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37872.log.gz 2026-03-31T20:41:57.553 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39651.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71498.log 2026-03-31T20:41:57.553 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39651.log.gz 2026-03-31T20:41:57.554 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50115.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50115.log.gz 2026-03-31T20:41:57.554 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53107.log 2026-03-31T20:41:57.554 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71498.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71498.log.gz 2026-03-31T20:41:57.554 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30852.log 2026-03-31T20:41:57.554 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62796.log 2026-03-31T20:41:57.555 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53107.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53107.log.gz 2026-03-31T20:41:57.555 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30852.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30852.log.gz 2026-03-31T20:41:57.555 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38489.log 2026-03-31T20:41:57.555 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62796.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-ro.33077.log 2026-03-31T20:41:57.555 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62796.log.gz 2026-03-31T20:41:57.555 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38489.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38489.log.gz 2026-03-31T20:41:57.555 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56038.log 2026-03-31T20:41:57.556 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-ro.33077.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-ro.33077.log.gz 2026-03-31T20:41:57.556 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70572.log 2026-03-31T20:41:57.556 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56038.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49007.log 2026-03-31T20:41:57.556 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56038.log.gz 2026-03-31T20:41:57.556 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70572.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70572.log.gz 2026-03-31T20:41:57.556 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44468.log 2026-03-31T20:41:57.557 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49007.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rd.33766.log 2026-03-31T20:41:57.557 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49007.log.gz 2026-03-31T20:41:57.557 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44468.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44468.log.gz 2026-03-31T20:41:57.557 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44369.log 2026-03-31T20:41:57.557 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rd.33766.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61197.log 
2026-03-31T20:41:57.557 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rd.33766.log.gz 2026-03-31T20:41:57.557 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44369.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44369.log.gz 2026-03-31T20:41:57.558 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50346.log 2026-03-31T20:41:57.558 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61197.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42147.log 2026-03-31T20:41:57.558 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61197.log.gz 2026-03-31T20:41:57.558 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50346.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50346.log.gz 2026-03-31T20:41:57.558 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56878.log 2026-03-31T20:41:57.558 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42147.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42963.log 2026-03-31T20:41:57.558 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42147.log.gz 2026-03-31T20:41:57.559 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56878.log.gz 2026-03-31T20:41:57.559 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69247.log 2026-03-31T20:41:57.559 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42963.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52097.log 2026-03-31T20:41:57.559 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42963.log.gz 2026-03-31T20:41:57.559 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69247.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69247.log.gz 2026-03-31T20:41:57.559 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32887.log 2026-03-31T20:41:57.559 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52097.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56158.log 2026-03-31T20:41:57.559 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52097.log.gz 2026-03-31T20:41:57.560 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32887.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32887.log.gz 2026-03-31T20:41:57.560 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50921.log 2026-03-31T20:41:57.560 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56158.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60949.log 2026-03-31T20:41:57.560 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56158.log.gz 2026-03-31T20:41:57.560 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50921.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50921.log.gz 2026-03-31T20:41:57.560 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46601.log 2026-03-31T20:41:57.560 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60949.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30779.log 2026-03-31T20:41:57.560 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60949.log.gz 2026-03-31T20:41:57.560 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46601.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46601.log.gz 2026-03-31T20:41:57.561 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30085.log 2026-03-31T20:41:57.561 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30779.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63891.log 2026-03-31T20:41:57.561 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30779.log.gz 2026-03-31T20:41:57.561 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30085.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30085.log.gz 2026-03-31T20:41:57.561 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40381.log 2026-03-31T20:41:57.561 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63891.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55896.log 2026-03-31T20:41:57.561 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63891.log.gz 2026-03-31T20:41:57.561 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40381.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40381.log.gz 2026-03-31T20:41:57.561 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38788.log 2026-03-31T20:41:57.562 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55896.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51522.log 2026-03-31T20:41:57.562 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55896.log.gz 2026-03-31T20:41:57.562 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38788.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38788.log.gz 2026-03-31T20:41:57.562 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57659.log 2026-03-31T20:41:57.562 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51522.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37522.log 2026-03-31T20:41:57.562 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51522.log.gz 2026-03-31T20:41:57.562 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57659.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57659.log.gz 2026-03-31T20:41:57.562 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35219.log 2026-03-31T20:41:57.563 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37522.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46114.log 2026-03-31T20:41:57.563 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37522.log.gz 2026-03-31T20:41:57.563 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35219.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35219.log.gz 2026-03-31T20:41:57.563 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53632.log 2026-03-31T20:41:57.563 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46114.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58788.log 2026-03-31T20:41:57.563 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46114.log.gz 2026-03-31T20:41:57.563 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53632.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53632.log.gz 2026-03-31T20:41:57.563 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32042.log 2026-03-31T20:41:57.564 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58788.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25756.log 2026-03-31T20:41:57.564 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58788.log.gz 2026-03-31T20:41:57.564 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32042.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32042.log.gz 2026-03-31T20:41:57.564 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64045.log 2026-03-31T20:41:57.564 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25756.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57021.log 2026-03-31T20:41:57.564 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25756.log.gz 2026-03-31T20:41:57.564 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64045.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64045.log.gz 2026-03-31T20:41:57.564 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25854.log 2026-03-31T20:41:57.564 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57021.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69294.log 2026-03-31T20:41:57.565 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57021.log.gz 2026-03-31T20:41:57.565 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25854.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25854.log.gz 2026-03-31T20:41:57.565 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26482.log 2026-03-31T20:41:57.565 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69294.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57208.log 2026-03-31T20:41:57.565 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69294.log.gz 2026-03-31T20:41:57.565 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26482.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26482.log.gz 2026-03-31T20:41:57.565 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27747.log 2026-03-31T20:41:57.565 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57208.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43791.log 2026-03-31T20:41:57.565 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57208.log.gz 2026-03-31T20:41:57.566 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27747.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27747.log.gz 2026-03-31T20:41:57.566 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59080.log 2026-03-31T20:41:57.566 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43791.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33939.log 2026-03-31T20:41:57.566 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43791.log.gz 2026-03-31T20:41:57.566 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59080.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59080.log.gz 2026-03-31T20:41:57.566 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57712.log 2026-03-31T20:41:57.566 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.33939.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50214.log 2026-03-31T20:41:57.566 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33939.log.gz 2026-03-31T20:41:57.567 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57712.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57712.log.gz 2026-03-31T20:41:57.567 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36562.log 2026-03-31T20:41:57.567 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50214.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30282.log 2026-03-31T20:41:57.567 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50214.log.gz 2026-03-31T20:41:57.567 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36562.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36562.log.gz 2026-03-31T20:41:57.567 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70020.log 2026-03-31T20:41:57.567 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30282.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45700.log 2026-03-31T20:41:57.567 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30282.log.gz 2026-03-31T20:41:57.567 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70020.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70020.log.gz 2026-03-31T20:41:57.568 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68368.log 2026-03-31T20:41:57.568 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45700.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29154.log 2026-03-31T20:41:57.568 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45700.log.gz 2026-03-31T20:41:57.568 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68368.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68368.log.gz 2026-03-31T20:41:57.568 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49271.log 2026-03-31T20:41:57.568 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29154.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78615.log 2026-03-31T20:41:57.568 
INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29154.log.gz 2026-03-31T20:41:57.568 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49271.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49271.log.gz 2026-03-31T20:41:57.568 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55459.log 2026-03-31T20:41:57.569 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78615.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35289.log 2026-03-31T20:41:57.569 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78615.log.gz 2026-03-31T20:41:57.569 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55459.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55459.log.gz 2026-03-31T20:41:57.569 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58010.log 2026-03-31T20:41:57.569 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35289.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47436.log 2026-03-31T20:41:57.569 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35289.log.gz 2026-03-31T20:41:57.569 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58010.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58010.log.gz 2026-03-31T20:41:57.569 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38666.log 2026-03-31T20:41:57.570 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47436.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46841.log 2026-03-31T20:41:57.570 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47436.log.gz 2026-03-31T20:41:57.570 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38666.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38666.log.gz 2026-03-31T20:41:57.570 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36704.log 2026-03-31T20:41:57.570 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46841.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35243.log 2026-03-31T20:41:57.570 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46841.log.gz 2026-03-31T20:41:57.570 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36704.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53195.log 2026-03-31T20:41:57.570 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36704.log.gz 2026-03-31T20:41:57.571 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35243.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35243.log.gz 2026-03-31T20:41:57.571 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39096.log 2026-03-31T20:41:57.571 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53195.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27511.log 2026-03-31T20:41:57.571 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53195.log.gz 2026-03-31T20:41:57.571 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39096.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45825.log 2026-03-31T20:41:57.571 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39096.log.gz 2026-03-31T20:41:57.572 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27511.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27511.log.gz 2026-03-31T20:41:57.572 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50189.log 2026-03-31T20:41:57.572 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45825.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39823.log 2026-03-31T20:41:57.572 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45825.log.gz 2026-03-31T20:41:57.572 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50189.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rw.33338.log 2026-03-31T20:41:57.572 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50189.log.gz 2026-03-31T20:41:57.573 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39823.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39823.log.gz 2026-03-31T20:41:57.573 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38157.log 2026-03-31T20:41:57.573 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rw.33338.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29422.log 2026-03-31T20:41:57.573 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rw.33338.log.gz 2026-03-31T20:41:57.573 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38157.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38157.log.gz 2026-03-31T20:41:57.573 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54182.log 2026-03-31T20:41:57.574 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29422.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54495.log 2026-03-31T20:41:57.574 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29422.log.gz 2026-03-31T20:41:57.574 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54182.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54182.log.gz 2026-03-31T20:41:57.574 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28102.log 2026-03-31T20:41:57.574 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54495.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47122.log 2026-03-31T20:41:57.574 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54495.log.gz 2026-03-31T20:41:57.575 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28102.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60180.log 2026-03-31T20:41:57.575 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28102.log.gz 2026-03-31T20:41:57.575 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47122.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47122.log.gz 
2026-03-31T20:41:57.575 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35634.log 2026-03-31T20:41:57.575 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60180.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37802.log 2026-03-31T20:41:57.575 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60180.log.gz 2026-03-31T20:41:57.576 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35634.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26826.log 2026-03-31T20:41:57.576 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35634.log.gz 2026-03-31T20:41:57.576 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37802.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37802.log.gz 2026-03-31T20:41:57.576 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28191.log 2026-03-31T20:41:57.576 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26826.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60472.log 2026-03-31T20:41:57.576 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26826.log.gz 2026-03-31T20:41:57.576 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28191.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28191.log.gz 2026-03-31T20:41:57.576 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78152.log 2026-03-31T20:41:57.577 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60472.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60472.log.gz 2026-03-31T20:41:57.577 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29812.log 2026-03-31T20:41:57.577 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78152.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49943.log 2026-03-31T20:41:57.577 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78152.log.gz 2026-03-31T20:41:57.577 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29812.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29812.log.gzgzip 2026-03-31T20:41:57.577 INFO:teuthology.orchestra.run.vm03.stderr: -5 --verbose -- /var/log/ceph/ceph-client.admin.30902.log 2026-03-31T20:41:57.578 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49943.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49943.log.gz 2026-03-31T20:41:57.578 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62268.log 2026-03-31T20:41:57.578 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30902.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29400.log 2026-03-31T20:41:57.578 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30902.log.gz 2026-03-31T20:41:57.578 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62268.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62268.log.gz 2026-03-31T20:41:57.578 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52761.log 2026-03-31T20:41:57.578 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29400.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29400.log.gz 2026-03-31T20:41:57.579 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35762.log 2026-03-31T20:41:57.579 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52761.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44074.log 2026-03-31T20:41:57.579 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52761.log.gz 2026-03-31T20:41:57.579 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35762.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46232.log 2026-03-31T20:41:57.579 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35762.log.gz 2026-03-31T20:41:57.579 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44074.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44074.log.gz 2026-03-31T20:41:57.580 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60018.log 2026-03-31T20:41:57.580 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46232.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66519.log 2026-03-31T20:41:57.580 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46232.log.gz 2026-03-31T20:41:57.580 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60018.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60018.log.gz 2026-03-31T20:41:57.580 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37162.log 2026-03-31T20:41:57.581 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66519.log.gz 2026-03-31T20:41:57.581 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49533.log 2026-03-31T20:41:57.581 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37162.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64596.log 2026-03-31T20:41:57.581 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37162.log.gz 2026-03-31T20:41:57.581 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49533.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49533.log.gz 2026-03-31T20:41:57.582 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41302.log 2026-03-31T20:41:57.582 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64596.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26777.log 2026-03-31T20:41:57.582 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64596.log.gz 2026-03-31T20:41:57.582 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41302.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41302.log.gz 2026-03-31T20:41:57.582 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56902.log 2026-03-31T20:41:57.583 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26777.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69045.log 2026-03-31T20:41:57.583 
INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26777.log.gz 2026-03-31T20:41:57.583 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56902.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56902.log.gz 2026-03-31T20:41:57.583 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37114.log 2026-03-31T20:41:57.583 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69045.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36177.log 2026-03-31T20:41:57.583 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69045.log.gz 2026-03-31T20:41:57.583 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37114.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37114.log.gz 2026-03-31T20:41:57.584 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66048.log 2026-03-31T20:41:57.584 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36177.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41917.log 2026-03-31T20:41:57.584 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36177.log.gz 2026-03-31T20:41:57.584 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66048.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28453.log 2026-03-31T20:41:57.584 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66048.log.gz 2026-03-31T20:41:57.584 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41917.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41917.log.gz 2026-03-31T20:41:57.585 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66744.log 2026-03-31T20:41:57.585 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28453.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68973.log 2026-03-31T20:41:57.585 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28453.log.gz 2026-03-31T20:41:57.585 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66744.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60496.log 2026-03-31T20:41:57.585 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66744.log.gz 2026-03-31T20:41:57.585 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68973.log.gz 2026-03-31T20:41:57.585 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26334.log 2026-03-31T20:41:57.586 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60496.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49223.log 2026-03-31T20:41:57.586 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60496.log.gz 2026-03-31T20:41:57.586 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26334.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26334.log.gz 2026-03-31T20:41:57.586 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32532.log 2026-03-31T20:41:57.586 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49223.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49223.log.gz 2026-03-31T20:41:57.586 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59656.log 2026-03-31T20:41:57.587 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32532.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63469.log 2026-03-31T20:41:57.587 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32532.log.gz 2026-03-31T20:41:57.587 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59656.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59656.log.gz 2026-03-31T20:41:57.587 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35932.log 2026-03-31T20:41:57.587 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63469.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63469.log.gz 2026-03-31T20:41:57.587 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26036.log 2026-03-31T20:41:57.588 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35932.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55677.log 2026-03-31T20:41:57.588 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35932.log.gz 2026-03-31T20:41:57.588 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26036.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26036.log.gz 2026-03-31T20:41:57.588 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65256.log 2026-03-31T20:41:57.588 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55677.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55677.log.gz 2026-03-31T20:41:57.588 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60349.log 2026-03-31T20:41:57.589 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65256.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34386.log 2026-03-31T20:41:57.589 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65256.log.gz 2026-03-31T20:41:57.589 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60349.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53758.log 2026-03-31T20:41:57.589 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60349.log.gz 2026-03-31T20:41:57.589 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34386.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34386.log.gz 2026-03-31T20:41:57.589 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35861.log 2026-03-31T20:41:57.590 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53758.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46981.log 2026-03-31T20:41:57.590 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53758.log.gz 2026-03-31T20:41:57.590 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35861.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35861.log.gz 2026-03-31T20:41:57.590 
INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50475.log 2026-03-31T20:41:57.590 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46981.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46981.log.gz 2026-03-31T20:41:57.590 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56689.log 2026-03-31T20:41:57.591 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50475.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30878.log 2026-03-31T20:41:57.591 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50475.log.gz 2026-03-31T20:41:57.591 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56689.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56689.log.gz 2026-03-31T20:41:57.591 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62045.log 2026-03-31T20:41:57.591 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30878.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30878.log.gz 2026-03-31T20:41:57.591 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28781.log 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62045.log: gzip -5 --verbose 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62045.log.gz 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr: -- /var/log/ceph/ceph-client.admin.39775.log 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28781.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28781.log.gz 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.64773.log 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39775.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67722.log 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39775.log.gz 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.64773.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.64773.log.gz 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36872.log 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67722.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29909.log 2026-03-31T20:41:57.592 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67722.log.gz 2026-03-31T20:41:57.593 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36872.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36872.log.gz 2026-03-31T20:41:57.593 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59174.log 2026-03-31T20:41:57.593 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29909.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27417.log 2026-03-31T20:41:57.593 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29909.log.gz 2026-03-31T20:41:57.593 
INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59174.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59174.log.gz
2026-03-31T20:41:57.593 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.27034.log
2026-03-31T20:41:57.593 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27417.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40000.log
2026-03-31T20:41:57.593 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27417.log.gz
2026-03-31T20:41:57.594 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.27034.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.27034.log.gz
2026-03-31T20:41:57.594 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56495.log
2026-03-31T20:41:57.594 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40000.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34887.log
2026-03-31T20:41:57.594 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40000.log.gz
2026-03-31T20:41:57.594 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56495.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56495.log.gz
2026-03-31T20:41:57.594 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60641.log
2026-03-31T20:41:57.594 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34887.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.38280.log
2026-03-31T20:41:57.594 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34887.log.gz
2026-03-31T20:41:57.594 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60641.log.gz
2026-03-31T20:41:57.595 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63376.log
2026-03-31T20:41:57.595 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.38280.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63748.log
2026-03-31T20:41:57.595 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.38280.log.gz
2026-03-31T20:41:57.595 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63376.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63376.log.gz
2026-03-31T20:41:57.595 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41076.log
2026-03-31T20:41:57.595 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63748.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62510.log
2026-03-31T20:41:57.595 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63748.log.gz
2026-03-31T20:41:57.595 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41076.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41076.log.gz
2026-03-31T20:41:57.595 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54237.log
2026-03-31T20:41:57.596 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69536.log
2026-03-31T20:41:57.596 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62510.log: /var/log/ceph/ceph-client.admin.54237.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62510.log.gz
2026-03-31T20:41:57.596 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54237.log.gz
2026-03-31T20:41:57.596 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66097.log
2026-03-31T20:41:57.596 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69536.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71640.log
2026-03-31T20:41:57.596 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69536.log.gz
2026-03-31T20:41:57.596 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66097.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66097.log.gz
2026-03-31T20:41:57.597 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78178.log
2026-03-31T20:41:57.597 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71640.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43767.log
2026-03-31T20:41:57.597 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71640.log.gz
2026-03-31T20:41:57.597 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78178.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78178.log.gz
2026-03-31T20:41:57.597 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53903.log
2026-03-31T20:41:57.598 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43767.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66641.log
2026-03-31T20:41:57.598 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43767.log.gz
2026-03-31T20:41:57.598 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53903.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53903.log.gz
2026-03-31T20:41:57.598 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26531.log
2026-03-31T20:41:57.598 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66641.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66641.log.gz
2026-03-31T20:41:57.598 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78519.log
2026-03-31T20:41:57.599 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43620.log
2026-03-31T20:41:57.599 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26531.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26531.log.gz
2026-03-31T20:41:57.599 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78519.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78519.log.gz
2026-03-31T20:41:57.599 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71270.log
2026-03-31T20:41:57.599 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43620.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29251.log
2026-03-31T20:41:57.599 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43620.log.gz
2026-03-31T20:41:57.599 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71270.log: 39.8% -- replaced with /var/log/ceph/ceph-client.admin.71270.log.gz
2026-03-31T20:41:57.599 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34260.log
2026-03-31T20:41:57.600 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29251.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63354.log
2026-03-31T20:41:57.600 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29251.log.gz
2026-03-31T20:41:57.600 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34260.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34260.log.gz
2026-03-31T20:41:57.600 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32159.log
2026-03-31T20:41:57.600 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63354.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54471.log
2026-03-31T20:41:57.600 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63354.log.gz
2026-03-31T20:41:57.600 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32159.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32159.log.gz
2026-03-31T20:41:57.600 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77913.log
2026-03-31T20:41:57.601 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54471.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77891.log
2026-03-31T20:41:57.601 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54471.log.gz
2026-03-31T20:41:57.601 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77913.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77913.log.gz
2026-03-31T20:41:57.601 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54130.log
2026-03-31T20:41:57.601 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77891.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42324.log
2026-03-31T20:41:57.601 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77891.log.gz
2026-03-31T20:41:57.601 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54130.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54130.log.gz
2026-03-31T20:41:57.601 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69117.log
2026-03-31T20:41:57.602 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42324.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.39456.log
2026-03-31T20:41:57.602 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42324.log.gz
2026-03-31T20:41:57.602 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69117.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69117.log.gz
2026-03-31T20:41:57.602 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55101.log
2026-03-31T20:41:57.602 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.39456.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69732.log
2026-03-31T20:41:57.602 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.39456.log.gz
2026-03-31T20:41:57.602 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55101.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55101.log.gz
2026-03-31T20:41:57.602 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50398.log
2026-03-31T20:41:57.603 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69732.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71006.log
2026-03-31T20:41:57.603 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69732.log.gz
2026-03-31T20:41:57.603 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50398.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50398.log.gz
2026-03-31T20:41:57.603 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70257.log
2026-03-31T20:41:57.603 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71006.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61147.log
2026-03-31T20:41:57.603 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71006.log.gz
2026-03-31T20:41:57.603 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70257.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70257.log.gz
2026-03-31T20:41:57.603 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45254.log
2026-03-31T20:41:57.604 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61147.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55557.log
2026-03-31T20:41:57.604 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61147.log.gz
2026-03-31T20:41:57.604 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45254.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45254.log.gz
2026-03-31T20:41:57.604 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67340.log
2026-03-31T20:41:57.604 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55557.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40151.log
2026-03-31T20:41:57.604 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55557.log.gz
2026-03-31T20:41:57.604 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67340.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67340.log.gz
2026-03-31T20:41:57.604 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35435.log
2026-03-31T20:41:57.605 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40151.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31179.log
2026-03-31T20:41:57.605 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40151.log.gz
2026-03-31T20:41:57.605 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35435.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35435.log.gz
2026-03-31T20:41:57.605 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41177.log
2026-03-31T20:41:57.605 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31179.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.33914.log
2026-03-31T20:41:57.605 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31179.log.gz
2026-03-31T20:41:57.605 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41177.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41177.log.gz
2026-03-31T20:41:57.605 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48838.log
2026-03-31T20:41:57.606 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.33914.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37306.log
2026-03-31T20:41:57.606 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.33914.log.gz
2026-03-31T20:41:57.606 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48838.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48838.log.gz
2026-03-31T20:41:57.606 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.77869.log
2026-03-31T20:41:57.606 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37306.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70811.log
2026-03-31T20:41:57.606 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37306.log.gz
2026-03-31T20:41:57.606 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.77869.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.77869.log.gz
2026-03-31T20:41:57.606 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34911.log
2026-03-31T20:41:57.607 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70811.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55871.log
2026-03-31T20:41:57.607 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70811.log.gz
2026-03-31T20:41:57.607 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34911.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34911.log.gz
2026-03-31T20:41:57.607 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36538.log
2026-03-31T20:41:57.607 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55871.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78779.log
2026-03-31T20:41:57.607 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55871.log.gz
2026-03-31T20:41:57.607 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36538.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36538.log.gz
2026-03-31T20:41:57.607 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28663.log
2026-03-31T20:41:57.608 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36152.log
2026-03-31T20:41:57.608 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78779.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78779.log.gz
2026-03-31T20:41:57.608 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28663.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28663.log.gz
2026-03-31T20:41:57.608 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32115.log
2026-03-31T20:41:57.608 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36152.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.23085.log
2026-03-31T20:41:57.608 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36152.log.gz
2026-03-31T20:41:57.608 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32115.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32115.log.gz
2026-03-31T20:41:57.609 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63941.log
2026-03-31T20:41:57.609 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.23085.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45377.log
2026-03-31T20:41:57.609 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.23085.log.gz
2026-03-31T20:41:57.609 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63941.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63941.log.gz
2026-03-31T20:41:57.609 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55797.log
2026-03-31T20:41:57.609 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45377.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57806.log
2026-03-31T20:41:57.609 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45377.log.gz
2026-03-31T20:41:57.609 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55797.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55797.log.gz
2026-03-31T20:41:57.610 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57140.log
2026-03-31T20:41:57.610 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57806.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47387.log
2026-03-31T20:41:57.610 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57806.log.gz
2026-03-31T20:41:57.610 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57140.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57140.log.gz
2026-03-31T20:41:57.610 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31664.log
2026-03-31T20:41:57.610 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37040.log
2026-03-31T20:41:57.611 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47387.log: /var/log/ceph/ceph-client.admin.31664.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47387.log.gz
2026-03-31T20:41:57.611 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31664.log.gz
2026-03-31T20:41:57.611 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35982.log
2026-03-31T20:41:57.611 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54156.log
2026-03-31T20:41:57.611 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37040.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37040.log.gz
2026-03-31T20:41:57.611 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35982.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35982.log.gz
2026-03-31T20:41:57.611 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.63493.log
2026-03-31T20:41:57.612 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54156.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41737.log
2026-03-31T20:41:57.612 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54156.log.gz
2026-03-31T20:41:57.612 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.63493.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.63493.log.gz
2026-03-31T20:41:57.612 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57116.log
2026-03-31T20:41:57.612 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41737.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28290.log
2026-03-31T20:41:57.612 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41737.log.gz
2026-03-31T20:41:57.612 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57116.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57116.log.gz
2026-03-31T20:41:57.613 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.68217.log
2026-03-31T20:41:57.613 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28290.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55268.log
2026-03-31T20:41:57.613 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28290.log.gz
2026-03-31T20:41:57.613 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.68217.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.68217.log.gz
2026-03-31T20:41:57.613 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.48072.log
2026-03-31T20:41:57.613 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55268.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41480.log
2026-03-31T20:41:57.613 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55268.log.gz
2026-03-31T20:41:57.614 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.48072.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.48072.log.gz
2026-03-31T20:41:57.614 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67894.log
2026-03-31T20:41:57.614 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41480.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.42759.log
2026-03-31T20:41:57.614 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41480.log.gz
2026-03-31T20:41:57.614 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67894.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67894.log.gz
2026-03-31T20:41:57.615 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60090.log
2026-03-31T20:41:57.615 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.42759.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41790.log
2026-03-31T20:41:57.615 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.42759.log.gz
2026-03-31T20:41:57.615 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60090.log.gz
2026-03-31T20:41:57.615 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59680.log
2026-03-31T20:41:57.615 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41790.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.69511.log
2026-03-31T20:41:57.615 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41790.log.gz
2026-03-31T20:41:57.615 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59680.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59680.log.gz
2026-03-31T20:41:57.615 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40536.log
2026-03-31T20:41:57.616 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.69511.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59777.log
2026-03-31T20:41:57.616 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.69511.log.gz
2026-03-31T20:41:57.616 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40536.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40536.log.gz
2026-03-31T20:41:57.616 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.57393.log
2026-03-31T20:41:57.616 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59777.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40484.log
2026-03-31T20:41:57.616 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59777.log.gz
2026-03-31T20:41:57.616 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.57393.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.57393.log.gz
2026-03-31T20:41:57.616 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71054.log
2026-03-31T20:41:57.617 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40484.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54759.log
2026-03-31T20:41:57.617 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40484.log.gz
2026-03-31T20:41:57.617 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71054.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71054.log.gz
2026-03-31T20:41:57.617 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-ro.33172.log
2026-03-31T20:41:57.617 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54759.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.bug.64510.log
2026-03-31T20:41:57.617 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54759.log.gz
2026-03-31T20:41:57.617 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-ro.33172.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-ro.33172.log.gz
2026-03-31T20:41:57.618 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53466.log
2026-03-31T20:41:57.618 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.bug.64510.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47197.log
2026-03-31T20:41:57.618 INFO:teuthology.orchestra.run.vm03.stderr: 52.5% -- replaced with /var/log/ceph/ceph-client.bug.64510.log.gz
2026-03-31T20:41:57.618 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53466.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53466.log.gz
2026-03-31T20:41:57.618 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.37234.log
2026-03-31T20:41:57.618 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47197.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.40639.log
2026-03-31T20:41:57.618 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47197.log.gz
2026-03-31T20:41:57.618 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.37234.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.37234.log.gz
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.62356.log
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.40639.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.55197.log
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.40639.log.gz
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.62356.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.62356.log.gz
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59245.log
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.55197.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.71305.log
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.55197.log.gz
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59245.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59245.log.gz
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.52384.log
2026-03-31T20:41:57.619 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.71305.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.25886.log
2026-03-31T20:41:57.620 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.71305.log.gz
2026-03-31T20:41:57.620 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.52384.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.52384.log.gz
2026-03-31T20:41:57.620 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-osd.2.log
2026-03-31T20:41:57.620 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.25886.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60998.log
2026-03-31T20:41:57.620 INFO:teuthology.orchestra.run.vm03.stderr: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.25886.log.gz
2026-03-31T20:41:57.620 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43364.log
2026-03-31T20:41:57.620 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60998.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60998.log.gz
2026-03-31T20:41:57.621 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43241.log
2026-03-31T20:41:57.621 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43364.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43364.log.gz
2026-03-31T20:41:57.621 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36992.log
2026-03-31T20:41:57.621 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43241.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43241.log.gz
2026-03-31T20:41:57.621 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67189.log
2026-03-31T20:41:57.622 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36992.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36992.log.gz
2026-03-31T20:41:57.622 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46531.log
2026-03-31T20:41:57.622 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67189.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67189.log.gz
2026-03-31T20:41:57.622 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29595.log
2026-03-31T20:41:57.623 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46531.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46531.log.gz
2026-03-31T20:41:57.636 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.46386.log
2026-03-31T20:41:57.636 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29595.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29595.log.gz
2026-03-31T20:41:57.636 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49391.log
2026-03-31T20:41:57.637 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.46386.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.46386.log.gz
2026-03-31T20:41:57.637 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.29350.log
2026-03-31T20:41:57.637 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49391.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49391.log.gz
2026-03-31T20:41:57.637 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.70597.log
2026-03-31T20:41:57.638 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.29350.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.29350.log.gz
2026-03-31T20:41:57.638 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.58036.log
2026-03-31T20:41:57.638 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.70597.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.70597.log.gz
2026-03-31T20:41:57.638 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.41506.log
2026-03-31T20:41:57.638 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.58036.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.58036.log.gz
2026-03-31T20:41:57.639 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.51548.log
2026-03-31T20:41:57.639 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.41506.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.41506.log.gz
2026-03-31T20:41:57.639 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.32406.log
2026-03-31T20:41:57.640 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.51548.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.51548.log.gz
2026-03-31T20:41:57.640 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66424.log
2026-03-31T20:41:57.640 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.32406.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.32406.log.gz
2026-03-31T20:41:57.640 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.34463.log
2026-03-31T20:41:57.640 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66424.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66424.log.gz
2026-03-31T20:41:57.641 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.xx-profile-rw.33413.log
2026-03-31T20:41:57.641 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.34463.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.34463.log.gz
2026-03-31T20:41:57.641 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.54347.log
2026-03-31T20:41:57.641 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.xx-profile-rw.33413.log: 0.0% -- replaced with /var/log/ceph/ceph-client.xx-profile-rw.33413.log.gz
2026-03-31T20:41:57.642 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.44831.log
2026-03-31T20:41:57.642 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.54347.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.54347.log.gz
2026-03-31T20:41:57.642 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53709.log
2026-03-31T20:41:57.642 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.44831.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.44831.log.gz
2026-03-31T20:41:57.642 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.53171.log
2026-03-31T20:41:57.643 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53709.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53709.log.gz
2026-03-31T20:41:57.643 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.36226.log
2026-03-31T20:41:57.643 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.53171.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.53171.log.gz
2026-03-31T20:41:57.643 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.78005.log
2026-03-31T20:41:57.644 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.36226.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.36226.log.gz
2026-03-31T20:41:57.644 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.50973.log
2026-03-31T20:41:57.644 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.78005.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.78005.log.gz
2026-03-31T20:41:57.644 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.47289.log
2026-03-31T20:41:57.644 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.50973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.50973.log.gz
2026-03-31T20:41:57.645 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.28501.log
2026-03-31T20:41:57.645 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.47289.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.47289.log.gz
2026-03-31T20:41:57.645 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.56543.log
2026-03-31T20:41:57.645 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.28501.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.28501.log.gz
2026-03-31T20:41:57.646 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.45455.log
2026-03-31T20:41:57.646 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.56543.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.56543.log.gz
2026-03-31T20:41:57.646 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.65227.log
2026-03-31T20:41:57.646 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.45455.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.45455.log.gz
2026-03-31T20:41:57.646 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60229.log
2026-03-31T20:41:57.647 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.65227.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.65227.log.gz
2026-03-31T20:41:57.647 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60973.log
2026-03-31T20:41:57.647 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60229.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60229.log.gz
2026-03-31T20:41:57.647 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-mon.a.log
2026-03-31T20:41:57.648 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60973.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60973.log.gz
2026-03-31T20:41:57.660 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin2.32725.log
2026-03-31T20:41:57.660 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-mon.a.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.59535.log
2026-03-31T20:41:57.668 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin2.32725.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin2.32725.log.gz
2026-03-31T20:41:57.668 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.49079.log
2026-03-31T20:41:57.676 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.59535.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.59535.log.gz
2026-03-31T20:41:57.676 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.31814.log
2026-03-31T20:41:57.676 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.49079.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.49079.log.gz
2026-03-31T20:41:57.684 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.35121.log
2026-03-31T20:41:57.684 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.31814.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.31814.log.gz
2026-03-31T20:41:57.700 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.66544.log
2026-03-31T20:41:57.700 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.35121.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.35121.log.gz
2026-03-31T20:41:57.712 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30440.log
2026-03-31T20:41:57.712 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.66544.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.66544.log.gz
2026-03-31T20:41:57.724 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.43090.log
2026-03-31T20:41:57.724 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30440.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30440.log.gz
2026-03-31T20:41:57.732 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.26654.log
2026-03-31T20:41:57.732 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.43090.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.43090.log.gz
2026-03-31T20:41:57.748 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.61858.log
2026-03-31T20:41:57.748 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.26654.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.26654.log.gz
2026-03-31T20:41:57.764 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.67314.log
2026-03-31T20:41:57.764 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.61858.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.61858.log.gz
2026-03-31T20:41:57.772 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.60687.log
2026-03-31T20:41:57.772 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.67314.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.67314.log.gz
2026-03-31T20:41:57.780 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /var/log/ceph/ceph-client.admin.30828.log
2026-03-31T20:41:57.780 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.60687.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.60687.log.gz
2026-03-31T20:41:57.788 INFO:teuthology.orchestra.run.vm03.stderr:/var/log/ceph/ceph-client.admin.30828.log: 0.0% -- replaced with /var/log/ceph/ceph-client.admin.30828.log.gz
2026-03-31T20:41:58.470 INFO:teuthology.orchestra.run.vm03.stderr: 90.9% -- replaced with /var/log/ceph/ceph-mon.a.log.gz
2026-03-31T20:41:58.603 INFO:teuthology.orchestra.run.vm03.stderr: 93.1% -- replaced with /var/log/ceph/ceph-osd.1.log.gz
2026-03-31T20:41:59.015 INFO:teuthology.orchestra.run.vm03.stderr: 93.7% -- replaced with /var/log/ceph/ceph-osd.0.log.gz
2026-03-31T20:41:59.165 INFO:teuthology.orchestra.run.vm03.stderr: 93.3% -- replaced with /var/log/ceph/ceph-osd.2.log.gz
2026-03-31T20:41:59.166 INFO:teuthology.orchestra.run.vm03.stderr:
2026-03-31T20:41:59.166 INFO:teuthology.orchestra.run.vm03.stderr:real 0m2.269s
2026-03-31T20:41:59.166 INFO:teuthology.orchestra.run.vm03.stderr:user 0m6.856s
2026-03-31T20:41:59.166 INFO:teuthology.orchestra.run.vm03.stderr:sys 0m0.526s
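The burst of stderr above is the log-compression pass that runs before archiving: every log under /var/log/ceph is gzipped in place, with several gzip processes running concurrently, which is why two messages occasionally share one captured line. A minimal sketch of the same effect, assuming GNU find and xargs are available on the node; the parallelism level is illustrative, not the value this run used:

    # Compress every log under /var/log/ceph in place, several files at a
    # time, mirroring the `gzip -5 --verbose -- <file>` invocations above.
    # -P 8 is an assumed degree of parallelism; tune as needed.
    sudo find /var/log/ceph -name '*.log' -print0 |
        sudo xargs -0 -P 8 -n 1 gzip -5 --verbose --

The 0.0% ratios are unsurprising: the per-PID client logs are essentially empty, so only the daemon logs compress meaningfully (mon.a 90.9%, osd.0/1/2 around 93%), and the whole pass finishes in the 2.3 seconds of wall time reported above.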
2026-03-31T20:41:59.166 INFO:tasks.ceph:Archiving logs...
2026-03-31T20:41:59.166 DEBUG:teuthology.misc:Transferring archived files from vm03:/var/log/ceph to /archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4344/remote/vm03/log
2026-03-31T20:41:59.166 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /var/log/ceph -- .
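Note that the archive step never stages a tarball on the node: tar writes to stdout (`-f -`) and teuthology unpacks the stream on the archive host. A rough manual equivalent, assuming SSH access to vm03 (the hostname from this run) and an illustrative local target directory:

    # Pull /var/log/ceph as a tar stream and unpack it locally, mirroring
    # the `sudo tar c -f - -C /var/log/ceph -- .` transfer above.
    mkdir -p remote/vm03/log
    ssh vm03 'sudo tar c -f - -C /var/log/ceph -- .' | tar x -f - -C remote/vm03/log

Streaming keeps the node's disk usage flat, which matters on these small VPS volumes.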
2026-03-31T20:41:59.629 DEBUG:teuthology.run_tasks:Unwinding manager install
2026-03-31T20:41:59.631 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer...
2026-03-31T20:41:59.631 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer
2026-03-31T20:41:59.676 INFO:teuthology.task.install.deb:Removing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on Debian system.
2026-03-31T20:41:59.677 DEBUG:teuthology.orchestra.run.vm03:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test ceph-volume radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done
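The teardown purges one package per iteration, with `|| true` so that packages that were never installed on this host cannot abort the unwind. The loop still passes `--force-yes`, which apt flags as deprecated in the warnings below; a sketch of the same loop using the `--allow-*` replacements apt suggests (package list shortened here, the full list is in the command above, and which `--allow-*` flags are actually needed is an assumption):

    # One-package-at-a-time purge loop without the deprecated --force-yes.
    for d in ceph cephadm ceph-mds ceph-mgr ceph-common ; do
        sudo DEBIAN_FRONTEND=noninteractive apt-get -y \
            --allow-downgrades --allow-remove-essential --allow-change-held-packages \
            -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" \
            purge "$d" || true   # a missing package must not fail the teardown
    done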
2026-03-31T20:41:59.743 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:41:59.861 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:41:59.861 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:41:59.951 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:41:59.951 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-31T20:41:59.951 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-31T20:41:59.951 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:41:59.958 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:41:59.959 INFO:teuthology.orchestra.run.vm03.stdout: ceph*
2026-03-31T20:42:00.113 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 1 to remove and 50 not upgraded.
2026-03-31T20:42:00.113 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 47.1 kB disk space will be freed.
2026-03-31T20:42:00.147 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 126150 files and directories currently installed.)
2026-03-31T20:42:00.148 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:00.666 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.osd.2 has been restored
2026-03-31T20:42:00.666 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.mon.a has been restored
2026-03-31T20:42:00.666 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.mon.b has been restored
2026-03-31T20:42:00.666 INFO:tasks.daemonwatchdog.daemon_watchdog:daemon ceph.mon.c has been restored
2026-03-31T20:42:00.995 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:01.028 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:01.151 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:42:01.151 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:42:01.246 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:01.247 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-31T20:42:01.247 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-31T20:42:01.247 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:01.254 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:01.255 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm* cephadm*
2026-03-31T20:42:01.397 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 2 to remove and 50 not upgraded.
2026-03-31T20:42:01.397 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 2177 kB disk space will be freed.
2026-03-31T20:42:01.430 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 126148 files and directories currently installed.)
2026-03-31T20:42:01.432 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-cephadm (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:01.441 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-cephadm, directory '/usr/share/ceph/mgr/cephadm/services' not empty so not removed
2026-03-31T20:42:01.449 INFO:teuthology.orchestra.run.vm03.stdout:Removing cephadm (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:01.476 INFO:teuthology.orchestra.run.vm03.stdout:Looking for files to backup/remove ...
2026-03-31T20:42:01.478 INFO:teuthology.orchestra.run.vm03.stdout:Not backing up/removing `/var/lib/cephadm', it matches ^/var/.*.
2026-03-31T20:42:01.479 INFO:teuthology.orchestra.run.vm03.stdout:Removing user `cephadm' ...
2026-03-31T20:42:01.479 INFO:teuthology.orchestra.run.vm03.stdout:Warning: group `nogroup' has no more members.
2026-03-31T20:42:01.496 INFO:teuthology.orchestra.run.vm03.stdout:Done.
2026-03-31T20:42:01.515 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-31T20:42:01.603 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 126062 files and directories currently installed.)
2026-03-31T20:42:01.605 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for cephadm (20.2.0-721-g5bb32787-1jammy) ...
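Each round shows dpkg's two phases: "Removing" deletes the package's installed files, and a later "Purging configuration files" pass clears its conffiles, which is exactly what `purge` adds over a plain `remove`. A quick check, if one is ever needed, for packages left in the halfway state:

    # List packages dpkg tracks as "rc": removed, but configuration files
    # still present (what `apt-get remove` leaves and `purge` cleans up).
    dpkg -l | awk '/^rc/ { print $2 }'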
2026-03-31T20:42:02.430 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:02.462 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:02.586 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:42:02.587 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:42:02.680 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:02.680 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-31T20:42:02.681 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev
2026-03-31T20:42:02.681 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:02.688 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:02.688 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds*
2026-03-31T20:42:02.828 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 1 to remove and 50 not upgraded.
2026-03-31T20:42:02.828 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 6851 kB disk space will be freed.
2026-03-31T20:42:02.861 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 126062 files and directories currently installed.)
2026-03-31T20:42:02.863 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mds (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:03.243 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-31T20:42:03.324 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 126054 files and directories currently installed.)
2026-03-31T20:42:03.326 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-mds (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:04.501 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:04.534 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:04.659 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:42:04.659 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:42:04.755 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:04.755 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core ceph-mon kpartx libboost-iostreams1.74.0
2026-03-31T20:42:04.755 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libpmemobj1 libsgutils2-2 python-asyncssh-doc
2026-03-31T20:42:04.755 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh python3-cachetools python3-cheroot python3-cherrypy3
2026-03-31T20:42:04.755 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-31T20:42:04.755 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-31T20:42:04.755 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-psutil
2026-03-31T20:42:04.755 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze.lru python3-requests-oauthlib python3-routes python3-rsa
2026-03-31T20:42:04.756 INFO:teuthology.orchestra.run.vm03.stdout: python3-simplejson python3-sklearn python3-sklearn-lib python3-tempora
2026-03-31T20:42:04.756 INFO:teuthology.orchestra.run.vm03.stdout: python3-threadpoolctl python3-webob python3-websocket python3-zc.lockfile
2026-03-31T20:42:04.756 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils sg3-utils-udev
2026-03-31T20:42:04.756 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:04.764 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:04.764 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr* ceph-mgr-dashboard* ceph-mgr-diskprediction-local*
2026-03-31T20:42:04.764 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-k8sevents*
2026-03-31T20:42:04.903 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 4 to remove and 50 not upgraded.
2026-03-31T20:42:04.903 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 219 MB disk space will be freed.
2026-03-31T20:42:04.937 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 126054 files and directories currently installed.)
2026-03-31T20:42:04.939 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-k8sevents (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:04.949 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-diskprediction-local (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:04.965 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-diskprediction-local, directory '/usr/share/ceph/mgr/diskprediction_local' not empty so not removed
2026-03-31T20:42:04.974 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-dashboard (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:05.033 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/services/auth' not empty so not removed
2026-03-31T20:42:05.033 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/plugins' not empty so not removed
2026-03-31T20:42:05.033 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/model' not empty so not removed
2026-03-31T20:42:05.033 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/controllers' not empty so not removed
2026-03-31T20:42:05.033 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-dashboard, directory '/usr/share/ceph/mgr/dashboard/api' not empty so not removed
2026-03-31T20:42:05.042 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:05.486 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 124272 files and directories currently installed.)
2026-03-31T20:42:05.488 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-mgr (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:05.833 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr, directory '/var/lib/ceph/mgr' not empty so not removed
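The "not empty so not removed" warnings are expected: dpkg only deletes files it installed, so directories that have accumulated generated files survive the purge. For the mgr module trees the usual leftovers are Python bytecode caches, though this log does not say which files remained; a sketch for inspecting such a directory (the path and the __pycache__ guess are assumptions):

    # Show the unpackaged files that keep a purged package's directory
    # from being removed; bytecode caches are a typical, but here
    # unconfirmed, culprit.
    sudo find /usr/share/ceph/mgr/dashboard -type f | head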
2026-03-31T20:42:06.686 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:06.718 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:06.841 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:42:06.841 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-31T20:42:06.935 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:06.943 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:06.943 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base* ceph-common* ceph-mon* ceph-osd* ceph-test* ceph-volume* radosgw*
2026-03-31T20:42:07.083 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 7 to remove and 50 not upgraded.
2026-03-31T20:42:07.083 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 732 MB disk space will be freed.
2026-03-31T20:42:07.117 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 124271 files and directories currently installed.)
2026-03-31T20:42:07.118 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-volume (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:07.172 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-osd (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:07.544 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mon (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:07.927 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-base (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:08.278 INFO:teuthology.orchestra.run.vm03.stdout:Removing radosgw (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:08.630 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-test (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:08.688 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-common (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:09.072 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-31T20:42:09.103 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-31T20:42:09.164 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 123780 files and directories currently installed.)
2026-03-31T20:42:09.166 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for radosgw (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:09.699 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-mon (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:10.061 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mon, directory '/var/lib/ceph/mon' not empty so not removed
2026-03-31T20:42:10.069 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-base (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:10.435 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-common (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:10.850 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-osd (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:11.206 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-osd, directory '/var/lib/ceph/osd' not empty so not removed
2026-03-31T20:42:12.046 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:12.079 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:12.202 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:42:12.203 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:42:12.297 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:12.297 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-31T20:42:12.297 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-31T20:42:12.297 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-31T20:42:12.297 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-31T20:42:12.297 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-31T20:42:12.297 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-31T20:42:12.298 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-31T20:42:12.298 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-31T20:42:12.298 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-31T20:42:12.298 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-31T20:42:12.298 INFO:teuthology.orchestra.run.vm03.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-31T20:42:12.298 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-31T20:42:12.298 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:12.305 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:12.305 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse*
2026-03-31T20:42:12.448 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 1 to remove and 50 not upgraded.
2026-03-31T20:42:12.448 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 2932 kB disk space will be freed.
2026-03-31T20:42:12.481 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 123764 files and directories currently installed.)
2026-03-31T20:42:12.483 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-fuse (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:12.853 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-31T20:42:12.933 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 123755 files and directories currently installed.)
2026-03-31T20:42:12.935 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-fuse (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:14.091 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:14.123 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:14.243 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:42:14.243 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout:Package 'ceph-test' is not installed, so not removed
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable python3-psutil python3-repoze.lru
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet
2026-03-31T20:42:14.333 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:14.347 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded.
2026-03-31T20:42:14.347 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:14.379 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:14.502 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:42:14.503 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
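apt treats the purge of an absent package as a notice ("Package 'ceph-test' is not installed, so not removed") and still exits zero, which is why the teardown can replay one fixed package list without querying installed state first. A sketch of that idempotent pattern (the package list here is illustrative, not teuthology's actual list):

    # Purging an already-absent package is a no-op for apt-get, so the loop is safe to re-run.
    for pkg in ceph-test ceph-volume radosgw python3-rgw python3-cephfs; do
        sudo DEBIAN_FRONTEND=noninteractive apt-get purge -y "$pkg"
    done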
2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout:Package 'ceph-volume' is not installed, so not removed 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable python3-psutil python3-repoze.lru 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet 2026-03-31T20:42:14.597 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-31T20:42:14.611 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded. 2026-03-31T20:42:14.611 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-31T20:42:14.643 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-31T20:42:14.768 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-31T20:42:14.769 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout:Package 'radosgw' is not installed, so not removed 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-asyncssh python3-cachetools python3-ceph-common python3-cheroot 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-natsort python3-portend 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable python3-psutil python3-repoze.lru 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplejson 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn python3-sklearn-lib python3-tempora python3-threadpoolctl 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: python3-wcwidth python3-webob python3-websocket python3-zc.lockfile 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils sg3-utils-udev smartmontools socat xmlstarlet 2026-03-31T20:42:14.863 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-31T20:42:14.877 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded. 2026-03-31T20:42:14.877 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-31T20:42:14.909 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-31T20:42:15.032 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-31T20:42:15.032 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout: socat xmlstarlet
2026-03-31T20:42:15.127 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:15.135 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:15.135 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs* python3-rados* python3-rgw*
2026-03-31T20:42:15.273 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 3 to remove and 50 not upgraded.
2026-03-31T20:42:15.274 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 2086 kB disk space will be freed.
2026-03-31T20:42:15.306 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 123755 files and directories currently installed.)
2026-03-31T20:42:15.307 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-cephfs (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:15.319 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-rgw (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:15.328 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-rados (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:16.168 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-31T20:42:16.201 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-31T20:42:16.323 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-31T20:42:16.323 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout:Package 'python3-rgw' is not installed, so not removed 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout: socat xmlstarlet 2026-03-31T20:42:16.417 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-31T20:42:16.431 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded. 2026-03-31T20:42:16.432 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-31T20:42:16.463 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-31T20:42:16.588 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-31T20:42:16.588 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout:Package 'python3-cephfs' is not installed, so not removed 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout: socat xmlstarlet 2026-03-31T20:42:16.680 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-31T20:42:16.694 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded. 2026-03-31T20:42:16.694 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-31T20:42:16.726 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-31T20:42:16.851 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-31T20:42:16.851 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph nvme-cli
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout: socat xmlstarlet
2026-03-31T20:42:16.946 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:16.954 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:16.954 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd*
2026-03-31T20:42:17.094 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 1 to remove and 50 not upgraded.
2026-03-31T20:42:17.094 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 1205 kB disk space will be freed.
2026-03-31T20:42:17.130 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 123731 files and directories currently installed.)
2026-03-31T20:42:17.131 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-rbd (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:17.961 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:17.993 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:18.117 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:42:18.118 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libcephfs-proxy2 libjq1 liboath0 libonig5 libpmemobj1
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout: socat xmlstarlet
2026-03-31T20:42:18.214 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:18.221 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:18.222 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-daemon* libcephfs-dev* libcephfs2*
2026-03-31T20:42:18.361 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 3 to remove and 50 not upgraded.
2026-03-31T20:42:18.361 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 2851 kB disk space will be freed.
2026-03-31T20:42:18.394 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 123723 files and directories currently installed.)
2026-03-31T20:42:18.396 INFO:teuthology.orchestra.run.vm03.stdout:Removing libcephfs-daemon (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:18.407 INFO:teuthology.orchestra.run.vm03.stdout:Removing libcephfs-dev (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:18.416 INFO:teuthology.orchestra.run.vm03.stdout:Removing libcephfs2 (20.2.0-721-g5bb32787-1jammy) ... 2026-03-31T20:42:18.439 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-31T20:42:19.264 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-31T20:42:19.296 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-31T20:42:19.419 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-31T20:42:19.419 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-31T20:42:19.515 INFO:teuthology.orchestra.run.vm03.stdout:Package 'libcephfs-dev' is not installed, so not removed 2026-03-31T20:42:19.515 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libcephfs-proxy2 libjq1 liboath0 libonig5 libpmemobj1 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 librdkafka1 librgw2 libsgutils2-2 libsqlite3-mod-ceph 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: nvme-cli python-asyncssh-doc python3-asyncssh python3-cachetools 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout: socat xmlstarlet 2026-03-31T20:42:19.516 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-31T20:42:19.530 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded. 2026-03-31T20:42:19.530 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-31T20:42:19.563 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-31T20:42:19.686 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-31T20:42:19.686 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-31T20:42:19.782 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-31T20:42:19.782 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-31T20:42:19.783 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-31T20:42:19.790 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:19.790 INFO:teuthology.orchestra.run.vm03.stdout: librados2* libradosstriper1* librbd1* librgw2* libsqlite3-mod-ceph*
2026-03-31T20:42:19.791 INFO:teuthology.orchestra.run.vm03.stdout: qemu-block-extra* rbd-fuse*
2026-03-31T20:42:19.928 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 7 to remove and 50 not upgraded.
2026-03-31T20:42:19.928 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 59.2 MB disk space will be freed.
2026-03-31T20:42:19.960 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 123701 files and directories currently installed.)
2026-03-31T20:42:19.962 INFO:teuthology.orchestra.run.vm03.stdout:Removing rbd-fuse (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:19.973 INFO:teuthology.orchestra.run.vm03.stdout:Removing libsqlite3-mod-ceph (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:19.983 INFO:teuthology.orchestra.run.vm03.stdout:Removing libradosstriper1 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:19.992 INFO:teuthology.orchestra.run.vm03.stdout:Removing qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-31T20:42:20.340 INFO:teuthology.orchestra.run.vm03.stdout:Removing librbd1 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:20.352 INFO:teuthology.orchestra.run.vm03.stdout:Removing librgw2 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:20.366 INFO:teuthology.orchestra.run.vm03.stdout:Removing librados2 (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:20.391 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-31T20:42:20.423 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-31T20:42:20.484 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 123650 files and directories currently installed.)
2026-03-31T20:42:20.486 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ...
2026-03-31T20:42:21.651 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:21.684 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:21.804 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-31T20:42:21.805 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-31T20:42:21.895 INFO:teuthology.orchestra.run.vm03.stdout:Package 'librbd1' is not installed, so not removed 2026-03-31T20:42:21.895 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-31T20:42:21.895 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-31T20:42:21.895 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2 2026-03-31T20:42:21.895 INFO:teuthology.orchestra.run.vm03.stdout: libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 2026-03-31T20:42:21.895 INFO:teuthology.orchestra.run.vm03.stdout: libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-31T20:42:21.896 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-31T20:42:21.909 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded. 2026-03-31T20:42:21.910 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-31T20:42:21.941 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-31T20:42:22.064 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-31T20:42:22.065 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout:Package 'rbd-fuse' is not installed, so not removed 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-31T20:42:22.159 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-31T20:42:22.173 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 50 not upgraded. 2026-03-31T20:42:22.173 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-31T20:42:22.175 DEBUG:teuthology.orchestra.run.vm03:> dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq 2026-03-31T20:42:22.227 DEBUG:teuthology.orchestra.run.vm03:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" autoremove 2026-03-31T20:42:22.299 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-31T20:42:22.429 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-31T20:42:22.430 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
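The dpkg pipeline issued above is a guard against packages stranded mid-install before the final autoremove runs. An annotated sketch of the same command:

    # In dpkg -l output, uppercase status letters mark bad states: U = unpacked,
    # H = half-installed, and a trailing R means reinstall-required; grep keeps only those rows.
    # awk extracts the package-name column, and xargs --no-run-if-empty skips the purge
    # entirely when nothing matched.
    dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' \
        | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq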
2026-03-31T20:42:22.529 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED:
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libcephfs-proxy2 libdouble-conversion3 libfuse2
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: libgfapi0 libgfrpc0 libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: libnbd0 liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: libqt5network5 librdkafka1 libsgutils2-2 libthrift-0.16.0 nvme-cli
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python3-asyncssh python3-cachetools
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cheroot python3-cherrypy3
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: python3-google-auth python3-jaraco.classes python3-jaraco.collections
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.functools python3-jaraco.text python3-joblib
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: python3-kubernetes python3-natsort python3-portend python3-prettytable
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-repoze.lru python3-requests-oauthlib python3-routes
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplejson python3-sklearn python3-sklearn-lib
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-wcwidth python3-webob
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-zc.lockfile qttranslations5-l10n sg3-utils
2026-03-31T20:42:22.530 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev smartmontools socat xmlstarlet
2026-03-31T20:42:22.667 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 64 to remove and 50 not upgraded.
2026-03-31T20:42:22.667 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 96.8 MB disk space will be freed.
2026-03-31T20:42:22.700 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 123650 files and directories currently installed.)
2026-03-31T20:42:22.702 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-modules-core (20.2.0-721-g5bb32787-1jammy) ...
2026-03-31T20:42:22.711 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/volumes/fs/operations/versions' not empty so not removed 2026-03-31T20:42:22.711 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/test_orchestrator' not empty so not removed 2026-03-31T20:42:22.711 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/telemetry' not empty so not removed 2026-03-31T20:42:22.711 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/telegraf' not empty so not removed 2026-03-31T20:42:22.711 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/status' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/stats/fs' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/snap_schedule/fs' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/selftest' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/rgw' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/rbd_support' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/prometheus' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/progress' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/pg_autoscaler' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/osd_support' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/osd_perf_query' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/orchestrator' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/nfs' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/mirroring/fs/dir_map' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/localpool' 
not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/iostat' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/insights' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/influx' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/devicehealth' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/crash' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/balancer' not empty so not removed 2026-03-31T20:42:22.712 INFO:teuthology.orchestra.run.vm03.stdout:dpkg: warning: while removing ceph-mgr-modules-core, directory '/usr/share/ceph/mgr/alerts' not empty so not removed 2026-03-31T20:42:22.716 INFO:teuthology.orchestra.run.vm03.stdout:Removing jq (1.6-2.1ubuntu3.1) ... 2026-03-31T20:42:22.726 INFO:teuthology.orchestra.run.vm03.stdout:Removing kpartx (0.8.8-1ubuntu1.22.04.4) ... 2026-03-31T20:42:22.736 INFO:teuthology.orchestra.run.vm03.stdout:Removing libboost-iostreams1.74.0:amd64 (1.74.0-14ubuntu3) ... 2026-03-31T20:42:22.746 INFO:teuthology.orchestra.run.vm03.stdout:Removing libboost-thread1.74.0:amd64 (1.74.0-14ubuntu3) ... 2026-03-31T20:42:22.755 INFO:teuthology.orchestra.run.vm03.stdout:Removing libcephfs-proxy2 (20.2.0-721-g5bb32787-1jammy) ... 2026-03-31T20:42:22.764 INFO:teuthology.orchestra.run.vm03.stdout:Removing libthrift-0.16.0:amd64 (0.16.0-2) ... 2026-03-31T20:42:22.773 INFO:teuthology.orchestra.run.vm03.stdout:Removing libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-31T20:42:22.782 INFO:teuthology.orchestra.run.vm03.stdout:Removing libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-31T20:42:22.791 INFO:teuthology.orchestra.run.vm03.stdout:Removing libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-31T20:42:22.810 INFO:teuthology.orchestra.run.vm03.stdout:Removing libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-31T20:42:22.819 INFO:teuthology.orchestra.run.vm03.stdout:Removing libfuse2:amd64 (2.9.9-5ubuntu3) ... 2026-03-31T20:42:22.828 INFO:teuthology.orchestra.run.vm03.stdout:Removing libgfapi0:amd64 (10.1-1ubuntu0.2) ... 2026-03-31T20:42:22.838 INFO:teuthology.orchestra.run.vm03.stdout:Removing libgfrpc0:amd64 (10.1-1ubuntu0.2) ... 2026-03-31T20:42:22.847 INFO:teuthology.orchestra.run.vm03.stdout:Removing libgfxdr0:amd64 (10.1-1ubuntu0.2) ... 2026-03-31T20:42:22.857 INFO:teuthology.orchestra.run.vm03.stdout:Removing libglusterfs0:amd64 (10.1-1ubuntu0.2) ... 2026-03-31T20:42:22.865 INFO:teuthology.orchestra.run.vm03.stdout:Removing libiscsi7:amd64 (1.19.0-3build2) ... 2026-03-31T20:42:22.874 INFO:teuthology.orchestra.run.vm03.stdout:Removing libjq1:amd64 (1.6-2.1ubuntu3.1) ... 2026-03-31T20:42:22.883 INFO:teuthology.orchestra.run.vm03.stdout:Removing liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-31T20:42:22.893 INFO:teuthology.orchestra.run.vm03.stdout:Removing libnbd0 (1.10.5-1) ... 
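The run of "not empty so not removed" warnings above is normal dpkg behaviour: it deletes only files it installed, and the mgr module trees still hold files generated at runtime (most likely Python bytecode caches, though the log does not say which). If a fully clean tree were required, a follow-up sweep along these lines would do it (an assumption-laden sketch, not something teuthology is shown running):

    # Clear generated caches so the otherwise-empty mgr module directories can go too.
    sudo find /usr/share/ceph/mgr -type d -name __pycache__ -prune -exec rm -rf {} +
    sudo find /usr/share/ceph/mgr -type d -empty -delete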
2026-03-31T20:42:22.901 INFO:teuthology.orchestra.run.vm03.stdout:Removing liboath0:amd64 (2.6.7-3ubuntu0.1) ... 2026-03-31T20:42:22.910 INFO:teuthology.orchestra.run.vm03.stdout:Removing libonig5:amd64 (6.9.7.1-2build1) ... 2026-03-31T20:42:22.919 INFO:teuthology.orchestra.run.vm03.stdout:Removing libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 2026-03-31T20:42:22.929 INFO:teuthology.orchestra.run.vm03.stdout:Removing libpmemobj1:amd64 (1.11.1-3build1) ... 2026-03-31T20:42:22.938 INFO:teuthology.orchestra.run.vm03.stdout:Removing librdkafka1:amd64 (1.8.0-1build1) ... 2026-03-31T20:42:22.947 INFO:teuthology.orchestra.run.vm03.stdout:Removing sg3-utils-udev (1.46-1ubuntu0.22.04.1) ... 2026-03-31T20:42:22.955 INFO:teuthology.orchestra.run.vm03.stdout:update-initramfs: deferring update (trigger activated) 2026-03-31T20:42:22.964 INFO:teuthology.orchestra.run.vm03.stdout:Removing sg3-utils (1.46-1ubuntu0.22.04.1) ... 2026-03-31T20:42:22.980 INFO:teuthology.orchestra.run.vm03.stdout:Removing libsgutils2-2:amd64 (1.46-1ubuntu0.22.04.1) ... 2026-03-31T20:42:22.989 INFO:teuthology.orchestra.run.vm03.stdout:Removing nvme-cli (1.16-3ubuntu0.3) ... 2026-03-31T20:42:23.322 INFO:teuthology.orchestra.run.vm03.stdout:Removing python-asyncssh-doc (2.5.0-1ubuntu0.1) ... 2026-03-31T20:42:23.336 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-asyncssh (2.5.0-1ubuntu0.1) ... 2026-03-31T20:42:23.388 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-kubernetes (12.0.1-1ubuntu1) ... 2026-03-31T20:42:23.630 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-google-auth (1.5.1-3) ... 2026-03-31T20:42:23.679 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-cachetools (5.0.0-1) ... 2026-03-31T20:42:23.729 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-ceph-argparse (20.2.0-721-g5bb32787-1jammy) ... 2026-03-31T20:42:23.775 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-ceph-common (20.2.0-721-g5bb32787-1jammy) ... 2026-03-31T20:42:23.825 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-cherrypy3 (18.6.1-4) ... 2026-03-31T20:42:23.887 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-cheroot (8.5.2+ds1-1ubuntu3.2) ... 2026-03-31T20:42:23.933 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-jaraco.collections (3.4.0-2) ... 2026-03-31T20:42:23.977 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-jaraco.classes (3.2.1-3) ... 2026-03-31T20:42:24.021 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-portend (3.0.0-1) ... 2026-03-31T20:42:24.066 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-tempora (4.1.2-1) ... 2026-03-31T20:42:24.110 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-jaraco.text (3.6.0-2) ... 2026-03-31T20:42:24.155 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-jaraco.functools (3.4.0-2) ... 2026-03-31T20:42:24.198 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-sklearn (0.23.2-5ubuntu6) ... 2026-03-31T20:42:24.304 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-joblib (0.17.0-4ubuntu1) ... 2026-03-31T20:42:24.358 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-natsort (8.0.2-1) ... 2026-03-31T20:42:24.404 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-prettytable (2.5.0-2) ... 2026-03-31T20:42:24.449 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-psutil (5.9.0-1build1) ... 2026-03-31T20:42:24.495 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-routes (2.5.1-1ubuntu1) ... 
2026-03-31T20:42:24.540 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-repoze.lru (0.7-2) ...
2026-03-31T20:42:24.584 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-31T20:42:24.629 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-rsa (4.8-1) ...
2026-03-31T20:42:24.674 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-simplejson (3.17.6-1build1) ...
2026-03-31T20:42:24.723 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-31T20:42:24.736 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-threadpoolctl (3.1.0-1) ...
2026-03-31T20:42:24.780 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-wcwidth (0.2.5+dfsg1-1) ...
2026-03-31T20:42:24.824 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-31T20:42:24.871 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-websocket (1.2.3-1) ...
2026-03-31T20:42:24.918 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-zc.lockfile (2.0-1) ...
2026-03-31T20:42:24.963 INFO:teuthology.orchestra.run.vm03.stdout:Removing qttranslations5-l10n (5.15.3-1) ...
2026-03-31T20:42:24.982 INFO:teuthology.orchestra.run.vm03.stdout:Removing smartmontools (7.2-1ubuntu0.1) ...
2026-03-31T20:42:25.358 INFO:teuthology.orchestra.run.vm03.stdout:Removing socat (1.7.4.1-3ubuntu4) ...
2026-03-31T20:42:25.370 INFO:teuthology.orchestra.run.vm03.stdout:Removing xmlstarlet (1.6.1-2.1) ...
2026-03-31T20:42:25.400 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ...
2026-03-31T20:42:25.410 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ...
2026-03-31T20:42:25.454 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for initramfs-tools (0.140ubuntu13.5) ...
2026-03-31T20:42:25.468 INFO:teuthology.orchestra.run.vm03.stdout:update-initramfs: Generating /boot/initrd.img-5.15.0-171-generic
2026-03-31T20:42:29.521 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-31T20:42:29.524 DEBUG:teuthology.parallel:result is None
2026-03-31T20:42:29.524 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm03.local
2026-03-31T20:42:29.524 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/apt/sources.list.d/ceph.list
2026-03-31T20:42:29.574 DEBUG:teuthology.orchestra.run.vm03:> sudo apt-get update
2026-03-31T20:42:29.672 INFO:teuthology.orchestra.run.vm03.stdout:Hit:1 http://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-31T20:42:29.904 INFO:teuthology.orchestra.run.vm03.stdout:Hit:2 http://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-31T20:42:30.014 INFO:teuthology.orchestra.run.vm03.stdout:Hit:3 http://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-31T20:42:30.123 INFO:teuthology.orchestra.run.vm03.stdout:Hit:4 http://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-31T20:42:30.764 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-31T20:42:30.777 DEBUG:teuthology.parallel:result is None
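The install teardown above follows a remove-then-refresh pattern: purge the Ceph packages (the --force-yes stderr warning is apt flagging a deprecated switch used by the remove command), delete the extra apt source file, and re-run apt-get update so the package index no longer references the Ceph repository. A minimal sketch of those last two cleanup commands, assuming passwordless sudo on the target node; this is illustrative, not teuthology's own code:

    #!/usr/bin/env python3
    # Replays the repo-cleanup step shown in the log above: drop the Ceph
    # apt source, then refresh the package index. Sketch only; assumes
    # passwordless sudo, as teuthology test nodes are configured.
    import subprocess

    subprocess.run(["sudo", "rm", "-f", "/etc/apt/sources.list.d/ceph.list"], check=True)
    subprocess.run(["sudo", "apt-get", "update"], check=True)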
2026-03-31T20:42:30.777 DEBUG:teuthology.run_tasks:Unwinding manager clock
2026-03-31T20:42:30.779 INFO:teuthology.task.clock:Checking final clock skew...
2026-03-31T20:42:30.779 DEBUG:teuthology.orchestra.run.vm03:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout: remote refid st t when poll reach delay offset jitter
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:==============================================================================
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:#141.84.43.73 171.237.1.87 2 u 31 64 377 31.736 -3.198 3.205
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:-46.21.2.169.sta .PPS. 1 u 22 64 377 29.551 -3.035 0.201
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:#172-236-195-26. 233.72.92.146 3 u 20 64 377 23.972 -2.590 0.441
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:-cp.hypermediaa. 189.97.54.122 2 u 31 64 177 25.037 -2.621 0.507
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:#ns8.starka.st 79.133.44.137 2 u 27 64 377 22.806 -4.028 0.294
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:-static.179.181. 161.62.157.173 3 u 29 64 177 23.541 -2.049 0.507
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:+vps-ber1.orlean 127.65.222.189 2 u 29 64 377 28.787 -1.054 0.620
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:*79.133.44.136 .MBGh. 1 u 32 64 377 20.495 -1.729 1.002
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:+static.215.156. 35.73.197.144 2 u 13 64 377 23.578 -1.725 0.265
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:-81.3.27.46 (ntp 194.58.204.196 2 u 25 64 377 27.388 -2.145 0.917
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:#185.125.190.56 146.131.121.246 2 u 65 64 377 35.745 -2.805 0.293
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:+cluster020.lino 130.149.17.21 2 u 18 64 377 25.086 -1.616 0.453
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:+time2.sebhostin 127.65.222.189 2 u 25 64 377 29.106 -1.601 0.494
2026-03-31T20:42:30.986 INFO:teuthology.orchestra.run.vm03.stdout:+185.125.190.58 99.220.8.133 2 u 50 64 377 31.235 -1.387 0.264
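The clock check runs ntpq -p and falls back to chronyc sources; here ntpq answered. In its table, rows with reach 0 are unsampled pool stubs, * marks the selected peer, + marks candidates, and the offset column is in milliseconds, so the worst skew above is about 4 ms. A sketch of how such output could be checked programmatically, assuming ntpq is installed and a hypothetical 50 ms threshold; teuthology's own skew logic may differ:

    #!/usr/bin/env python3
    # Report the worst absolute peer offset from `ntpq -p` (offsets in ms).
    # Sketch only; the threshold and parsing are assumptions.
    import subprocess

    THRESHOLD_MS = 50.0  # hypothetical acceptable skew

    out = subprocess.run(["ntpq", "-p"], capture_output=True, text=True, check=True).stdout
    offsets = []
    for line in out.splitlines()[2:]:                 # skip the two header rows
        fields = line.split()
        if len(fields) >= 10 and fields[-4] != "0":   # reach == 0 -> never sampled
            offsets.append(abs(float(fields[-2])))    # offset column
    if offsets and max(offsets) > THRESHOLD_MS:
        raise SystemExit(f"clock skew too high: {max(offsets):.3f} ms")
    print(f"worst offset: {max(offsets):.3f} ms" if offsets else "no reachable peers")

Parsing from the end of each row (fields[-2] for offset, fields[-4] for reach) tolerates peer names that split into extra fields, such as the "-81.3.27.46 (ntp" row above.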
2026-03-31T20:42:30.987 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab
2026-03-31T20:42:30.988 INFO:teuthology.task.ansible:Skipping ansible cleanup...
2026-03-31T20:42:30.989 DEBUG:teuthology.run_tasks:Unwinding manager selinux
2026-03-31T20:42:30.990 DEBUG:teuthology.run_tasks:Unwinding manager pcp
2026-03-31T20:42:30.992 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer
2026-03-31T20:42:30.994 INFO:teuthology.task.internal:Duration was 1426.806118 seconds
2026-03-31T20:42:30.994 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog
2026-03-31T20:42:30.996 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring...
2026-03-31T20:42:30.996 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart
2026-03-31T20:42:31.014 INFO:teuthology.task.internal.syslog:Checking logs for errors...
2026-03-31T20:42:31.014 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm03.local
2026-03-31T20:42:31.015 DEBUG:teuthology.orchestra.run.vm03:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1
2026-03-31T20:42:31.062 INFO:teuthology.task.internal.syslog:Gathering journalctl...
2026-03-31T20:42:31.062 DEBUG:teuthology.orchestra.run.vm03:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log
2026-03-31T20:42:31.121 INFO:teuthology.task.internal.syslog:Compressing syslogs...
2026-03-31T20:42:31.122 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose --
2026-03-31T20:42:31.169 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-31T20:42:31.170 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-31T20:42:31.170 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz
2026-03-31T20:42:31.170 INFO:teuthology.orchestra.run.vm03.stderr:
2026-03-31T20:42:31.170 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz
2026-03-31T20:42:31.174 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 86.3% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz
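The syslog check above is one grep pipeline: match \bBUG\b|\bINFO\b|\bDEADLOCK\b in kern.log, strip eighteen known-benign exclusion patterns, and keep only the first surviving line, so an empty result means the check passes. (The gzip stderr lines that follow are interleaved because xargs --max-procs=0 runs the compressions in parallel.) The same scan idea in Python, with the exclusion list heavily abbreviated and an illustrative local path rather than the archive path from the log:

    #!/usr/bin/env python3
    # First-suspicious-line scan over kern.log, mirroring the grep pipeline:
    # match error keywords, drop known-benign lines, stop at the first hit.
    # The benign list here is abbreviated; the real task carries many more.
    import re

    ERROR_RE = re.compile(r"\bBUG\b|\bINFO\b|\bDEADLOCK\b")
    BENIGN = [
        re.compile(r"task .* blocked for more than .* seconds"),
        re.compile(r"INFO:ceph-create-keys"),
        re.compile(r"CRON"),
    ]

    with open("kern.log", errors="replace") as f:  # path is illustrative
        for line in f:
            if ERROR_RE.search(line) and not any(p.search(line) for p in BENIGN):
                print("first suspicious line:", line.rstrip())
                break
        else:
            print("no suspicious lines")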
2026-03-31T20:42:31.175 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo
2026-03-31T20:42:31.177 INFO:teuthology.task.internal:Restoring /etc/sudoers...
2026-03-31T20:42:31.177 DEBUG:teuthology.orchestra.run.vm03:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers
2026-03-31T20:42:31.221 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump
2026-03-31T20:42:31.224 DEBUG:teuthology.orchestra.run.vm03:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump
2026-03-31T20:42:31.269 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = core
2026-03-31T20:42:31.275 DEBUG:teuthology.orchestra.run.vm03:> test -e /home/ubuntu/cephtest/archive/coredump
2026-03-31T20:42:31.322 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-31T20:42:31.322 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive
2026-03-31T20:42:31.324 INFO:teuthology.task.internal:Transferring archived files...
2026-03-31T20:42:31.324 DEBUG:teuthology.misc:Transferring archived files from vm03:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-31_11:18:10-rados-tentacle-none-default-vps/4344/remote/vm03
2026-03-31T20:42:31.324 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- .
2026-03-31T20:42:31.370 INFO:teuthology.task.internal:Removing archive directory...
2026-03-31T20:42:31.370 DEBUG:teuthology.orchestra.run.vm03:> rm -rf -- /home/ubuntu/cephtest/archive
2026-03-31T20:42:31.372 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload
2026-03-31T20:42:31.374 INFO:teuthology.task.internal:Not uploading archives.
2026-03-31T20:42:31.374 DEBUG:teuthology.run_tasks:Unwinding manager internal.base
2026-03-31T20:42:31.376 INFO:teuthology.task.internal:Tidying up after the test...
2026-03-31T20:42:31.377 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest
2026-03-31T20:42:31.418 INFO:teuthology.orchestra.run.vm03.stdout: 258067 4 drwxr-xr-x 2 ubuntu ubuntu 4096 Mar 31 20:42 /home/ubuntu/cephtest
2026-03-31T20:42:31.419 DEBUG:teuthology.run_tasks:Unwinding manager console_log
2026-03-31T20:42:31.424 INFO:teuthology.run:Summary data:
description: rados/singleton-bluestore/{all/cephtool mon_election/connectivity msgr-failures/none msgr/async-v2only objectstore/bluestore/{alloc$/{avl} base mem$/{normal-2} onode-segment$/{1M} write$/{v2/{compr$/{yes$/{lz4}} v2}}} rados supported-random-distro$/{ubuntu_latest}}
duration: 1426.8061184883118
flavor: default
owner: kyr
success: true
2026-03-31T20:42:31.424 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-31T20:42:31.442 INFO:teuthology.run:pass
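One detail worth noting in the coredump teardown above: rmdir --ignore-fail-on-non-empty deletes the coredump directory only when nothing was dumped, so the subsequent test -e returning remote process result 1 is the success path, confirming no cores were produced before the job is reported as pass. A sketch of the same check, assuming the archive path from the log; this is a hypothetical helper, not teuthology code:

    #!/usr/bin/env python3
    # After teardown, the coredump dir survives only if cores were dumped,
    # so a failed existence check is the good outcome. Sketch; path assumed.
    import os

    COREDUMP_DIR = "/home/ubuntu/cephtest/archive/coredump"

    if os.path.exists(COREDUMP_DIR):
        raise SystemExit("coredumps present; the job should be marked failed")
    print("no coredumps found")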